setFetchBatchSize doesn't seem to work properly

I have asked a few questions on this topic but still can't get it to work. I have a Core Data store with 10k+ rows of people's names that I am showing in a table view. I would like the search to update the table with every letter typed, but it's very laggy. As suggested, I watched the WWDC '10 Core Data presentation and tried to implement

[request setFetchBatchSize:50];

It doesn't seem to work. When I use Instruments to check Core Data, it still shows 10k+ fetches when the table view loads, and when I search it still pulls in all the results.
Is there anything else that needs to be done to set the batch size, or is that not something that will help me?

The only thing that seems to work is setting the fetchLimit to 100 when I search. Do you think that's a good solution?

    Thanks in advance!

    Answer

    The batch size just tells it how many objects to fetch at a time. This is probably not going to help you very much. Let’s consider your use case a bit…

    The user types "F" and you tell the database, "Go find all the names that start with 'F'," and the database looks at all 10k+ records to find the ones that start with "F".

    Then, the user types ‘r’, so you tell the database to go find all the records that start with “Fr” and it again looks at all 10k+ records to find the ones that start with “Fr.”

    All fetchBatchSize is doing is telling it “Hey, when you fault in a record, bring in 50 at once because I’m going to probably need all those anyway.” That does nothing to limit your search.

    However, setting fetchLimit to 100 helps, because once the database has found 100 records that satisfy the request, it stops searching; it does not have to keep looking at the rest of the 10k+ records, since it has already filled the request.
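    To make the distinction concrete, here is a sketch (the `Person` entity, `name` attribute, and the `context`/`searchText` variables are illustrative, not taken from the question):

```objc
NSFetchRequest *request = [[NSFetchRequest alloc] init];
request.entity = [NSEntityDescription entityForName:@"Person"
                             inManagedObjectContext:context];
request.predicate = [NSPredicate predicateWithFormat:@"name BEGINSWITH %@",
                                                     searchText];
request.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"name"
                                                          ascending:YES]];

// Faulting granularity only: rows are materialized 50 at a time as the
// table view touches them. The predicate still scans the whole store.
[request setFetchBatchSize:50];

// This is what actually cuts the search short: the store stops as soon
// as it has found 100 matching rows.
[request setFetchLimit:100];

NSError *error = nil;
NSArray *results = [context executeFetchRequest:request error:&error];
```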

    So, there are several things you can do, all depending on your other use cases.

    The easiest thing to try is adding an index on the field that you are searching. You can set that in the Xcode model editor (the section that says Indexes, right under where you name the entity in the inspector). This lets the database set up a special index on that field, and searching will be much faster.
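    If you would rather flip the same switch in code than in the model editor, the attribute description exposes an `indexed` flag (note: a model can only be mutated before it is attached to a persistent store, and newer SDKs replace this flag with `NSFetchIndexDescription`). A sketch with illustrative names:

```objc
// Load the model and mark the "name" attribute as indexed.
// Mutating a model is only legal before a store coordinator uses it.
NSManagedObjectModel *model =
    [[NSManagedObjectModel alloc] initWithContentsOfURL:modelURL];
NSEntityDescription *person = [model entitiesByName][@"Person"];
NSAttributeDescription *nameAttr = [person attributesByName][@"name"];
nameAttr.indexed = YES;
```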

    Second, after your initial request, you already have an array of names that begin with 'F', so there is no need to go back to the database to ask for names that begin with 'Fr'. If it begins with 'Fr', it also begins with 'F', and you already have NSManagedObject pointers for all of those. Now you can just search the array you got back.

    Even better, if you gave it a sort descriptor, the array is sorted. Thus, you can do a simple binary search on the array. Or, if you prefer, you can just use the same predicate, and apply it to the results array instead of the database.
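    In code, refining the previous results in memory might look like this (`previousResults` and `newSearchText` are assumed to exist; `previousResults` holds the objects fetched for the shorter prefix):

```objc
// The user typed one more letter: filter the array we already have
// instead of going back to the store.
NSPredicate *refine =
    [NSPredicate predicateWithFormat:@"name BEGINSWITH %@",
                                     newSearchText]; // e.g. @"Fr"
NSArray *narrowed = [previousResults filteredArrayUsingPredicate:refine];
```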

    Even if you don't use the results-pruning I just discussed, I think indexing the attribute will help dramatically.


    Maybe you should run Instruments to see how much time you are spending where. Also, a badly formed predicate can bring any index scheme to its knees. Code would help.
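    As a rough rule of thumb (exact behavior depends on how Core Data translates the predicate to SQL), anchored prefix matches can use a binary index, while substring or case/diacritic-insensitive matches tend to force a full scan:

```objc
// Can walk the index: rows are ordered by name, so the matching
// prefix range is contiguous.
NSPredicate *fast = [NSPredicate predicateWithFormat:@"name BEGINSWITH %@", text];

// Usually cannot: every row's value must be transformed and inspected.
NSPredicate *slow = [NSPredicate predicateWithFormat:@"name CONTAINS[cd] %@", text];
```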

    Finally, consider how many elements you are bringing into memory. Core Data does not fault all the information in, but it does create placeholder objects for everything in the array.

    I don't know exactly how SQLite implements its search on an index, but a B-tree lookup has complexity log_B(N), so even on 30k records that's not a lot of searching. Unless you have another problem, the indexing should give you a pretty big improvement.

    Once you have the index, you should not be examining all records. However, you may still match a very large set of data. Use fetchBatchSize there, because it materializes records in batches and creates proxies for the rest.

    You can also call countForFetchRequest:error: instead of executeFetchRequest:error: to get the number of items. Then you can use fetchLimit to restrict the number you get.
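    For example, counting first and then capping the fetch (entity, attribute, and variable names are illustrative):

```objc
NSFetchRequest *request = [[NSFetchRequest alloc] init];
request.entity = [NSEntityDescription entityForName:@"Person"
                             inManagedObjectContext:context];
request.predicate = [NSPredicate predicateWithFormat:@"name BEGINSWITH %@",
                                                     searchText];

NSError *error = nil;
// Counts matches in the store without materializing any objects.
NSUInteger matches = [context countForFetchRequest:request error:&error];

// Then fetch only as many as the UI can usefully show.
[request setFetchLimit:100];
NSArray *results = [context executeFetchRequest:request error:&error];
```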

    As far as working all this with a fetched results controller… well, that guy has to know the records, so it still has to do the search.

    Also, one place to look… are you using sections? If you have a user-defined comparator for anything (like transforming names into section titles), it will get called for every single record.

    Thus, I guess my big suggestion, after making the index change, is to run instruments and really study it to determine where you are spending your time. It should be pretty obvious. That will help steer you toward the real issue.

    My bet is that you are still accessing all of the elements for some reason…