code 107: "Rate limit exceeded. Slow down cowboy."

cbanack

A few of the users of my app have started reporting a new error code, "code 107: Rate limit exceeded. Slow down cowboy."

Looks like you guys have implemented a good way to stop people from abusing the web API.

That means it's time for me to make some changes to my app, like throttling API hits so users don't accidentally exceed their rate limit, or providing sensible rate limit error messages so that users know what is expected of them. But I can't do those things effectively without some more details.
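(For the record, the kind of throttling I'm talking about is just enforcing a minimum gap between requests. A rough sketch in Python — the 1-second interval is a guess on my part until we know the actual limit:)

```python
import time

class Throttle:
    """Enforce a minimum interval between consecutive API calls."""

    def __init__(self, min_interval=1.0):
        self.min_interval = min_interval  # seconds; a guess until the real limit is known
        self._last = 0.0

    def wait(self):
        """Block just long enough to keep calls min_interval seconds apart."""
        now = time.monotonic()
        remaining = self.min_interval - (now - self._last)
        if remaining > 0:
            time.sleep(remaining)
        self._last = time.monotonic()
```

The app would call `wait()` immediately before each request to the API.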

Could someone in the know (@mrpibb, perhaps?) please tell us more about how this all works?

  • what exactly causes a user to exceed the rate limit? is it based on the speed that API requests come in? or the total number of requests per day?
  • once a user loses access because they've exceeded their rate limit, how long does it take before they have access again?
  • does the API key used have anything to do with the rate limit count, or is it just based on the number of hits from a user's IP address?

Thanks!

deactivated-5fbfd5d291164

@cbanack: Hey... just as a reminder, last I was told, callouts in the first post don't work.

#3  Edited By cbanack

@cbanack: Hey... just as a reminder, last I was told, callouts in the first post don't work.

Thanks for letting me know! I guess I'll try a second callout to @mrpibb then, in case he didn't get the first one. :)

#4  Edited By mrpibb

@cbanack: hey, we're allowing 200 per 15 minutes per API key (which should also get users off your key and onto their own). Let's chat over email if this is too few; in monitoring the cvscraper forums, it seems that only users doing complete series redos are having issues.

#5 jslack  Moderator

@cbanack: As @mrpibb said, we are doing 200 per API key per 15 minutes. This was needed, as we were too generous with our API in the past, and some of the keys were being abused. As we work on finishing up API v2, we'll also have new documentation and such to go along with it.

If there's a good reason why you are going over the limit, you can contact us, and we'll figure something out. I don't feel that 200 / 15 minutes is a very restrictive limit. If it is, get at me, and we'll work it out.

comictagger

@mrpibb @jslack It would be great if you guys could give us a little advance notice on this forum of API changes like this before they go live. That would give those of us who have apps that talk to the API some lead time to have fixes ready when the changes go up. Maybe just a new sticky thread, "Upcoming API changes", with some minimal details on each update? Thanks!

#7  Edited By scottchilders

I'm not a coder so forgive me if this is impractical, but could those who help contribute to the database get slightly higher limits?

This limiting is encouraging people to get their own API key anyway, so why not give those who keep the database up to date a little reward?

Again - I know nothing about how the API is coded, but if it were possible, it could be an encouragement for sometime contributors (like myself) to get more active in filling in the holes in database coverage.

cbanack

@scottchilders said:

I'm not a coder so forgive me if this is impractical, but could those who help contribute to the database get slightly higher limits?

That's a good idea!

In a similar vein, maybe API keys associated with accounts that have purchased Comic Vine premium membership should get more access, too?

#9 jslack  Moderator

@comictagger: It was necessary for E3, as we were seeing way too much abuse of our API on Comic Vine. CV Scraper and other apps that share an API key are extremely aggressive, and don't do any kind of throttling when requests fail.

Your app should be storing data after retrieving data from the API, and each client should not be doing live requests to the API on every page load.

We are rolling out a new API, and it will feature documentation and information about limits. What we have now is a stopgap solution. We are keeping an eye on requests, failures and abusive keys so we can get a better idea of our final implementation requirements.

@comictagger What is your app? Like I said above, go ahead and PM me your requirements and we can make it work.

#10 jslack  Moderator

@scottchilders: Yes, we can do something like that, and it's a good idea. The hard part with that, is to make sure we aren't rewarding "garbage" edits. We have a pretty simple karma system currently, we just have to make sure the type of system you describe doesn't encourage garbage contributions. But I have been thinking about a karma based timeout system as well. We'll do a post on it later, and I'd like you to contribute your thoughts.

evilhero

To facilitate users getting their own API key more quickly when using these types of applications - is there a way to reset an API key so that it generates a new one?

This would help force users to get their own key; otherwise the application won't work as the developers originally intended.

#12  Edited By comictagger

@jslack I have no issues with the change itself. The limitation seems reasonable for the usage on my app, ComicTagger (https://code.google.com/p/comictagger/), which is similar in functionality to the Comic Vine Scraper plugin for ComicRack. The app currently does a lot of caching of results, so that for a tagging session, most of the searching doesn't need to be repeated. But of course, with a single key for all users, it's not enough. Since I can't control which users are using that key simultaneously, the best solution is to let them provide their own API key.

(I actually posted a question on this board (with my prior user ID) about a year and a half ago addressing this issue: http://www.comicvine.com/forums/api-developers-2334/api-key-usage-719514/#2 . With no response I went ahead with a single API key for the app.)

So, at any rate, my current concern is just getting hit with a change to the functionality of the CV interface without any warning, as I started getting postings on my issues forum and emails about a broken app out of the blue. In this case, I'm going to have to add a configurable-by-the-user CV API key setting (and properly educate the user on the use of it), and handle the new error properly. None of this is rocket science, but it takes time and testing on multiple platforms. With a little lead time, I might have avoided a temporarily broken app. So my only request is, going forward, give us all a little "heads up" on stuff like this.

#13 jslack  Moderator

@comictagger: The choice was have the entire site fall down, or implement a limit. In the past, users have been fairly respectful, but it's been getting worse and worse.

I agree, a heads up would have been nice. But we've been communicating with several of our heavier users (CV Scraper), and we browse and post in the forums of people using our API to try to stay on top of this. We implemented this on CV as kind of an emergency fix, to stop the site falling over (which we hear about from users all the time).

We'll write a post about it to let you know of the new official limits. What we have now is a test run.

And yes, being able to have users add their own API key to their configs in your app would be a good idea.

cbanack

@jslack said:

@comictagger: It was necessary for E3, as we were seeing way too much abuse of our API on Comic Vine. CV Scraper and other apps that share an API key are extremely aggressive, and don't do any kind of throttling when requests fail.

Your app should be storing data after retrieving data from the API, and each client should not be doing live requests to the API on every page load.

@jslack: re: extremely aggressive: Now that I have an error code for the "rate limit" failure, the upcoming release of CV Scraper will stop scraping when the first rate limit failure occurs. That should tone things down in the near future. Right now, many users are only discovering the new rate limit after they've come back from a scrape of a zillion comics and noticed that every one of them failed. I'll also take a look and see if I can do a better job of cancelling the entire operation whenever a series of failures happens, so that CV Scraper doesn't keep trying to request data when something is clearly wrong.

CV Scraper definitely does cache each page--there are no "page loads" per se, since it is a batch updater, but it doesn't load any data a second time, unless the user explicitly "re-scrapes" a file at some point in the future.

Do you think it would be helpful if I throttled CV Scraper so that individual batches of scrapes are processed more slowly? I could easily put in some kind of a delay between every 5th or 6th request.

As I mentioned in my PM, the limit of 200 hits every 15 minutes is falling just under what many of my (not heavy) users need. Would you consider switching it to be 800 hits every 60 minutes, instead? That would allow more breathing room for most users, and I don't think it would lead to significantly more load on your servers.

#15 jslack  Moderator

@cbanack: Hey, happy to work with you (and @comictagger) on your needs.

We can look at a couple of different options to help you out. We just want to make sure that API hits don't affect our overall site performance (soon this won't be a problem any more, and we can increase the API rate limit). For now, is there anything that would help you in the error message? I have been hesitant to provide exact data in the error message, such as: "You've exceeded the 200 request / per 15 min limit, you have 11:59 remaining on your cooldown", but maybe that's not so bad, and could be helpful for you? Let me know what we can do to help you on that, so it's easier for your app to know when to stop requests, and when to retry. Maybe that cooldown count would really help you.
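To sketch what I mean — if the error text carried a cooldown like that, a client could parse it out and sleep it off instead of retrying blindly. The message wording below is purely hypothetical; nothing is final:

```python
import re

def cooldown_seconds(message):
    """Pull the remaining cooldown out of a hypothetical error message like
    "You've exceeded the 200 request / per 15 min limit, you have 11:59
    remaining on your cooldown". Returns seconds to wait, or None when the
    message carries no cooldown."""
    match = re.search(r"(\d+):(\d\d) remaining", message)
    if match is None:
        return None
    minutes, seconds = int(match.group(1)), int(match.group(2))
    return minutes * 60 + seconds
```

A client could then `time.sleep()` that long before retrying, rather than hammering the endpoint until the window resets.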

As far as the batches are concerned, we can probably work something out. During non peak hours, we can get away with more traffic on the API. Perhaps we can look at changing the API limit for certain keys (like yours, and other strong contributors) on certain evenings, so you are free to go crazy over night and get your data populated.

800/60 isn't too bad. I planned on having different tiers anyway. As others have mentioned, contributing users and other active comicvine users should be rewarded for good behavior and excellent submissions, so I'd like to be able to increase the limit for those.

@all: I'd like to clarify that we will soon have a new API platform, and it will be de-coupled from our other site requests, thus it will stop impacting site performance when we get hit hard. Worst case scenario is that the API becomes unavailable with a message: "Our API is under heavy load, so we are limiting you to 200/15". The current solution is a stopgap while I iron a few things out. I apologize if it's affected your apps, it's temporary while we work this out. It just really sucks when we have to monitor boxes on certain days, and watch site performance closely in fear that we are getting crawled hard.

Now that the busy days of E3 are wrapping up, I have some time to finish up on the new API stuff. I'll put some thoughts together in the forum and have you guys collaborate on what you want. I'd like to make the API unlimited, up until it affects other users, and only throttle at that point - that's the ideal scenario.

comictagger

@jslack Very cool. I really do appreciate all of the work there, and this amazing free service as well.

My build machine is currently being.. rebuilt, but once that's done, and I have some time, I want to profile the auto-tagging process on my app, to get a better sense of the average API hits that are occurring, and get back to you.

Also, I like the idea of communicating a cooldown time: that way the automated processes can respect it without having to keep hitting the API.

#17  Edited By cbanack

@jslack: CV Scraper guy here.

I'm ambivalent about adding more details to the error message. Don't get me wrong, if the error tells me how long is remaining in the cooldown period, I would definitely report that to users in the error dialog that I show them. But I don't think it will modify their behaviour very much--they'll still download comic details as fast as they can until they either a) run out of new comics or b) hit the rate limit.

The new version of my app (will be released tonight) will definitely stop ALL requests as soon as the rate limit is hit (or when the API key is invalid, etc). I looked again at my error handling last night and I'm embarrassed to admit that the aggressive "retrying" in the face of an unresponsive database was a totally unintended bug. As the latest update of CV Scraper is adopted, you should see much, much less of that.

I've also throttled the connection a little, so requests won't come in at top speed any more.

cbanack

@jslack said:

As far as the batches are concerned, we can probably work something out. During non peak hours, we can get away with more traffic on the API. Perhaps we can look at changing the API limit for certain keys (like yours, and other strong contributors) on certain evenings, so you are free to go crazy over night and get your data populated.

800/60 isn't too bad. I planned on having different tiers anyway. As others have mentioned, contributing users and other active comicvine users should be rewarded for good behavior and excellent submissions, so I'd like to be able to increase the limit for those.

I don't know if there's much point in increasing the limit for my key. There are (best guess) well over 1000 users who are trying to use that key right now, so it is basically always at its limit. You'd have to add a LOT to the limit for that key before it became regularly usable again...and then if my app ever gets more popular, you'd just have to keep bumping the limit up.

As I mentioned in my PM, there are really two solutions here: 1) you guys change the rate limiting to be based on user IP address instead of API key, or 2) I change CV Scraper to require EVERYONE to get their own API key. I've already implemented option 2, but I haven't put it out yet because I wanted to see what you guys thought of these options. (Obviously option 1 would be harder for you, but easier for my users.)

As for the rate limiting, I'm happy to hear that you're considering 800/60. I also think having the limit automatically go up at night or something like that would be great. If you try some higher limits, I think you'll find the load on your server doesn't go up too much. That's because I believe that most of the load is coming from a small number of users who are frequently re-scraping their entire (tens of thousands) collections. Any limit, even a high one, is going to foil those users, so you should see a big improvement regardless.

#19  Edited By yobuddy

Are there any plans to relax the limits at all? I just downloaded ComicRack, but organizing 20,000 files is going to be painful if I'm stuck at 200 calls every 15 mins. I'd gladly pay for a CV membership if the limits were removed.

#20  Edited By dwake

@yobuddy: Agree. I actually went ahead and did an annual membership, but the limit is still enforced on my account's API key. I use the site regularly and don't mind paying for convenience.

#21  Edited By ericcholis

@cbanack said:

As for the rate limiting, I'm happy to hear that you're considering 800/60. I also think having the limit automatically go up at night or something like that would be great. If you try some higher limits, I think you'll find the load on your server doesn't go up too much. That's because I believe that most of the load is coming from a small number of users who are frequently re-scraping their entire (tens of thousands) collections. Any limit, even a high one, is going to foil those users, so you should see a big improvement regardless.

Agreed, the 800/60 would be a nice increase. In my case, I'm attempting to simply get a list of all characters, their aliases, and real names. This is about 800 pages of 100 results each. For the time being, I'll throttle my application to pause my requests until the rate limit is released.

Also, I would suggest a ping request that doesn't count against an API key's rate limit and would return the availability of the API. Possibly it could return the rate limit details (i.e. number of requests allotted and number used).
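Roughly, the pausing I have in mind looks like this: spend the request budget for a window, then sleep out whatever remains of that window before continuing. The 200/15 numbers are the current limit from this thread; the offset math assumes my 100-results-per-page requests:

```python
import time

REQUESTS_PER_WINDOW = 200   # current limit discussed in this thread
WINDOW_SECONDS = 15 * 60

def paced_offsets(total_pages, page_size=100):
    """Yield one result offset per request, sleeping out the rest of each
    15-minute window once that window's request budget is spent."""
    used = 0
    window_start = time.monotonic()
    for page in range(total_pages):
        if used >= REQUESTS_PER_WINDOW:
            elapsed = time.monotonic() - window_start
            if elapsed < WINDOW_SECONDS:
                time.sleep(WINDOW_SECONDS - elapsed)
            used = 0
            window_start = time.monotonic()
        used += 1
        yield page * page_size
```

Each yielded offset would go into the API's pagination parameter; for 800 pages this works out to four windows, i.e. about an hour of wall-clock time at the current limit.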

#22  Edited By 69Samael69

I'll come forward and admit it: my process is one of those that would be considered very aggressive, but there is a good explanation for it. It takes the data in ComicRack that was scraped with the ComicVine scraper and queries ComicVine looking for missing issues in a collection. There are built-in scripts for ComicRack that will find gaps in volumes, but nothing that will find missing issues at the end of a volume, or when "issue number" is something other than a number. This is especially useful for volumes that have been on hiatus for a while and suddenly become active again, and for those released infrequently or irregularly. For this you need to query ComicVine to get a current list of issue numbers in a given volume and compare it against the local collection.

It's an extremely useful tool for active collections, but it can generate A LOT of queries to the API, especially if a user is requerying the entire collection. Here's the reason: at the time I wrote it, the ComicVine volume ID was not stored in ComicRack in any usable way, so I had to query every issue in order to find which volume to place it in. I store select volume information (basically title, start year, and the associated issue numbers) in local cache files, and it defaults to incremental updates, which is typically not a huge amount of querying. I also suggested people only do full updates every 6 months or so, because it's simply overkill to do it more often.

Some time over the last year, custom fields were implemented in ComicRack, and now the Scraper is storing the volume ID for each issue. Everything that does not have this field will unfortunately have to be rescraped to create these custom fields, but this is a one-time pain. For a large collection, this change will cut the number of required queries for a full build down from tens of thousands to potentially only hundreds (essentially, the number of volumes). I've been looking at reducing this further by using the last-modified date on the volume and skipping volume queries for those that have not been updated in a given amount of time. This has proven to be problematic, but it's still on my radar.

I've now implemented an internal speed limit so my users can define a delay between queries, lessening the load on ComicVine, and my users will have to get their own API key. I'm hoping to complete the changes and release the new version within the next week or so, at which point I'll likely ask to have my current API key disabled and changed in order to force people to get their own. Like previous incarnations of the scraper, my key is hard-coded into the process.

I also support the 800/60 limit, as I think this would be easily sufficient for most of my users. Even for the several thousand volumes that might be in extremely large collections, this is manageable if they add a few seconds of delay between queries to lighten the query load. While I don't use the tool much myself anymore, and I have no idea how many people do use it (I don't think many), I continue to maintain it for those few who do.

percussionmasta

I just tried to scrape data for 24 issues (haven't scraped anything since last week). It did the first 23 issues just fine, and on the 24th I received the rate limit error. I'm using my personal API key.

This seems unreasonable.

DylanHensler

I'm having a similar issue - tried to scrape 40 issues with Comic Vine Scraper, and got the message about exceeding my maximum limit after 10 issues were successful. The limit definitely seems like it's less than 200 / 15 minutes, or the Scraper is sending 20 requests per issue.

69Samael69

I've typically been around 100 before hitting the limit. I've assumed the scraper makes 2 calls: probably one to see if the API is alive and another for the actual query. It's going to take a LOOOONNNGGGGGG time to rescrape everything to bring in those custom fields to ComicRack.

#26  Edited By cbanack

The CV Scraper uses about 5-6 API calls per newly scraped comic (for searching, paging through search results, cover art, and obtaining issue details). This number will be higher if you are loading titles for all issues in a series/volume, browsing through issues, re-searching with new search terms, etc. But the average seems to work out to around 30-40 comics per 15 minutes.

Also, as 69Samael69 noticed, it uses 2 calls per rescraped comic.

One of the CV developers mentioned that he was considering bumping the limit to 800/60, which would allow about 120 new comics per hour. @jslack, do you still think this is going to happen? How bad was the load on Wednesday?

leperwdup

Using CV Scraper with these limits is quite horrible. I am trying to reorganize my whole collection and am only able to do about 40 comics per 15 min, so it is going to be a very long process. I hope the limits get bumped soon, or that they bring out some form of tiered API limits.

knightmare187

I added 94 new comics to my ComicRack and used the new Comic Vine Scraper. I reached the limit at 33 issues, then the second time at 39 issues, then finished it off. So with the new limits it seems I will only be allowed one week's worth of comics every 15 min or so. I just wanted to post my results. I only do this about once a month, maybe twice.

#29 jslack  Moderator

@leperwdup @knightmare187 @cbanack @69samael69 @dylanhensler @percussionmasta and all others:

As you guys have mentioned, I can't control how many requests each of these 3rd-party apps makes to the endpoint. It's likely they call the API 5 or 6 times in a row for a single result, in which case the limit is exhausted quickly.

It's definitely suggested that your apps allow the user to add their own key, which will be used instead of the hard-coded keys.

These API limits are intended to stop bots, DDoSers, spammers, and others who are sharing API keys unreasonably. They're not intended to stop any normal behavior you are entitled to as a user. Our current limit is the same as Twitter's API limit, and similar to Google's (Maps and other services) and Facebook's. Like I said previously, we'll have more information coming soon. I'm going to bump up the limit slightly as a test run and see how it goes.

You guys will hear more on this later this week. Don't hesitate to PM me and I can get back to you.

leperwdup

Thanks for the reply @jslack. Just a bit annoying, as I have only just gotten back into comics after a 10 year hiatus and getting my new collection in order is turning out to be a huge pain with these limits. But I can see why you had to put them in place due to the rampant problems with DDoSers and spam.

I look forward to updates about this in the future.

#31 jslack  Moderator

@leperwdup: I hear ya. I too left comic books as a boy. It's super hard to get back into, without having to read a trillion books.

If you are using CV Scraper, I think it's still using a shared API key. Once it gets updated to allow a user-specific key, it will solve your problems.

@cbanack We've got a patch going out which will slightly raise API limits for a while. Should be live within 1-2 days.

cbanack

@jslack said:

@leperwdup: I hear ya. I too left comic books as a boy. It's super hard to get back into, without having to read a trillion books.

If you are using CV Scraper, I think it's still using a shared API key. Once it gets updated to allow a user-specific key, it will solve your problems.

@cbanack We've got a patch going out which will slightly raise API limits for a while. Should be live within 1-2 days.

The latest version of CV Scraper requires you to use your own API key. However, older versions of the scraper still use a shared key (mine). It would be really helpful if you guys (@jslack or @mrpibb) could disable or better yet change my API key so that people are forced to update to the latest version of CV Scraper. This should help you guys, too, because the newest version of CV Scraper accesses the CV API more efficiently than previous versions.

#33  Edited By 69Samael69

@cbanack said:

The latest version of CV Scraper requires you to use your own API key. However, older versions of the scraper still use a shared key (mine). It would be really helpful if you guys (@jslack or @mrpibb) could disable or better yet change my API key so that people are forced to update to the latest version of CV Scraper. This should help you guys, too, because the newest version of CV Scraper accesses the CV API more efficiently than previous versions.

I was going to ask for the same, to have my key killed so I can request a new one, also to force people using my script to update.

matthewlupo

Hello all,

Just wanted to say thank you to @jslack for getting back to everyone so quickly with responses. I've been monitoring this thread with a lot of interest, as my application has been having problems with running over the API request limit. I find I can just barely tag the number of comics I would like with my app within the 15 minutes, so like everyone else is saying, at least the 800/60 option would help out. At least until the API is separated from the main site. Thanks again for everything.

mrpibb

@cbanack: @69samael69: send me a PM and I'll change your API keys.
cbandes

One thing that would help is if it was explicitly mentioned in the TOS that it's ok to cache and/or locally store the data. My app really doesn't need to connect to the API at runtime as long as I can update my data periodically and then serve users the cached data instead of hitting the API directly.

#37 jslack  Moderator

@cbandes: You're right, it would be nice if the legalese said something like that. But we actually prefer that you store the data locally (localstorage, CoreData, SQLite, or some other caching solution). As long as you don't mislead people about where the data is coming from, or use it for some other bad purpose, it's ok. Basically, don't be evil.
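For anyone wondering what that looks like in practice, here's a minimal SQLite sketch — the table name and schema are just one way to do it, not anything official on our end:

```python
import sqlite3
import time

def open_cache(path=":memory:"):
    """Open (or create) a tiny one-table response cache."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS api_cache (
                      url TEXT PRIMARY KEY,
                      body TEXT NOT NULL,
                      fetched_at REAL NOT NULL)""")
    return db

def cached_fetch(db, url, fetch, max_age=24 * 3600):
    """Return the cached body for url, calling fetch(url) only on a cache
    miss or when the stored copy is older than max_age seconds."""
    row = db.execute("SELECT body, fetched_at FROM api_cache WHERE url = ?",
                     (url,)).fetchone()
    if row and time.time() - row[1] < max_age:
        return row[0]
    body = fetch(url)
    db.execute("INSERT OR REPLACE INTO api_cache VALUES (?, ?, ?)",
               (url, body, time.time()))
    db.commit()
    return body
```

With something like this, each client hits the API only when its local copy has expired, instead of on every page load.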

#38 jslack  Moderator

@matthewlupo: No problem, we are rolling out a larger limit today to give it a shot. Expect it to be live by this evening.

69Samael69

@jslack Are you moving to the 800/60 limit?

cbanack

@mrpibb said:

@cbanack: @69samael69: send me a PM and I'll change your API keys.

I've sent you a PM, @mrpibb. Actually, I've sent several PMs to you and @jslack over the last week or two, but I haven't gotten any responses to any of them. Is that just because you guys are super busy and your inboxes are full of spurious PMs (which I quite understand), or is it possible that my PMs aren't being delivered? Just thought I'd better check.

#41  Edited By jslack  Moderator

@69samael69: Experimenting with 400/15 as a test, and it's live.

mrpibb

Sorry guys, we just got through E3, so we're recovering :)

yobuddy

It is so much better right now. Thank you for upping the limit.

#44  Edited By cbandes

@jslack: Thanks for that answer. I just wanted to be sure that caching/preprocessing wasn't going to violate your "don't redistribute" rule. My app is really just for fun and I don't think more than a handful of people have even looked at it, but I still wanted to make sure I was being respectful of your rules. I appreciate the 'don't be evil' idea :)

leperwdup

Thanks for upping the limit. Much better. I would like more, but I'm hard to please. Either way it's a huge improvement, thanks.

#46 jslack  Moderator

@yobuddy @cbandes @leperwdup Glad to hear. There will be more changes coming. The goal is to get the limit as high as technically possible.

matthewlupo

@jslack Thanks for upping the limit; much appreciated that you're working with us on this. Glad to hear that things are being worked on and you guys are looking to make the limits as high as possible. :)

stufff11

I'd like to second the request for a significantly higher limit for paid users. I just heard about ComicRack and ComicVine today and am going through the process of organizing my rather large collection. After I hit the limit for the first time I poked around for a bit, looking for a way to pay to bypass it, and was sad when I couldn't find anything. I'd happily pay for the privilege of more API hits.

cbanack

@stufff11 said:

I'd like to second the request for a significantly higher limit for paid users. I just heard about ComicRack and ComicVine today and am going through the process of organizing my rather large collection. After I hit the limit for the first time I poked around for a bit, looking for a way to pay to bypass it, and was sad when I couldn't find anything. I'd happily pay for the privilege of more API hits.

@stufff11: The SCRAPE_DELAY setting in Comic Vine Scraper may be of some use to you.

#50  Edited By jczorkmid

@jslack said:

I have been hesitant to provide exact data in the error message, such as: "You've exceeded the 200 request / per 15 min limit, you have 11:59 remaining on your cooldown", but maybe that's not so bad, and could be helpful for you? Let me know what we can do to help you on that, so it's easier for your app to know when to stop requests, and when to retry.

I'm a bit late to this party, but I haven't been doing much with the API recently; then I hit this yesterday. Usually I've found rate limit info to be available through the API itself. For example, Twitter sets extra HTTP headers in its responses:

  • X-RateLimit-Limit: 350
  • X-RateLimit-Remaining: 350
  • X-RateLimit-Reset: 1277485629

You can also explicitly request the limits (say, on startup), and get a response like this:

{
    "remaining_hits": 150,
    "reset_time_in_seconds": 1319138031,
    "hourly_limit": 150,
    "reset_time": "Thu Oct 20 19:13:51 +0000 2011"
}

I have rate limiting code, but it needs to know what the rate is to be useful. Something like this would be a great help to me at least.
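For what it's worth, here's roughly what my rate limiting code would do with headers like those, if Comic Vine adopted them. The header names are Twitter's, not anything Comic Vine provides today:

```python
import time

def seconds_until_allowed(headers, now=None):
    """Given Twitter-style X-RateLimit-* headers, return how long to wait
    before issuing the next request (0 while requests remain in the window)."""
    now = time.time() if now is None else now
    remaining = int(headers.get("X-RateLimit-Remaining", 1))
    if remaining > 0:
        return 0.0
    reset = float(headers.get("X-RateLimit-Reset", now))  # epoch seconds
    return max(0.0, reset - now)
```

The caller would sleep for the returned number of seconds; it's zero whenever the window still has room, so the happy path costs nothing.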

Update:

Additionally, I notice that all API requests seem to return "Cache-Control: no-cache" (even '/types'). It might help to send something more reasonable. Either something based on the type of resource requested, or even 12-24 hours across the board.