Forum Discussion

joetesta
Roku Guru
10 years ago

any way to find orphaned objects?

Greetings,

I'm stuck on an issue and hoping for some help.
When our app launches, it makes an API call and gets data (currently playing programs). At this point, hitting the home button shows about 30K orphaned objects.

Now I'm trying to add a feature that allows the user to view more data (future scheduled programs) by making additional API calls. After a point, the Roku freezes and reboots.
I assumed it was due to too much data stored in memory, so I'm testing the same feature without storing any of the data - just making the API calls - and it's still freezing and rebooting after 10+ API calls.
I'm trying to figure out why, and when exiting the app after a few such API calls, there are > 300K orphaned objects.
I've been tracing through the code looking for anything that might be storing the data and building up these orphans, but with no success.
Is there any way to see where these orphans are being stored? Are they necessarily in the global (m) array?
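
In case it's relevant, here is roughly the shape of the test - a stripped-down illustration, not my actual code (the endpoint and names are made up):

    ' Illustration only: make the calls repeatedly without keeping anything,
    ' printing the garbage collector report between calls to watch the counts.
    sub hammerApi()
        for i = 1 to 15
            xfer = CreateObject("roUrlTransfer")
            xfer.SetCertificatesFile("common:/certs/ca-bundle.crt")
            xfer.SetUrl("https://example.com/api/schedule?page=" + i.ToStr()) ' made-up endpoint
            raw = xfer.GetToString()   ' the ~3MB JSON body
            parsed = ParseJson(raw)
            parsed = invalid           ' deliberately not storing anything
            raw = invalid
            ? "after call "; i
            ? RunGarbageCollector()
        end for
    end sub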

thanks in advance for any help with this,
Joe

15 Replies

  • "joetesta" wrote:
    thanks - I've been trying to set everything = invalid in the debugger, going up / down and using var, but even after setting everything I can see to invalid it brought the GC count from 67k down to about 64k

    It seems you haven't freed all the objects yet - if you had, the instance count would be (near) 0. But it is possible that a single object (e.g. a giant XML, EPG, or RAF object) is holding all the others hostage. Walk the call stack with up/down, check the list of vars, invalidate each one and see how much the GC count changes - that's a way to pinpoint the memory hog (there's a quick sketch of this at the end of this post). Don't forget to look inside "m" when at the top main() level (amusingly, that's "At bottom of context chain" in Roku parlance) - and NB: setting m=invalid will not free the global context; instead you'll have to invalidate each field inside it (you can still shoot them all at once with `for each k in m: m[k] = invalid: next`).

    Oh, and one more thing - threads. If you are using the XML-graphic scenes, all my bets are off. Each of them keeps its own variable context, doesn't it? How does that mesh with GC?
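    To make the stack walking concrete, this is the kind of session I mean - step through frames, drop a suspect reference, and see whether the counts move ("bigList" is a made-up variable name):
    BrightScript Debugger> var
    BrightScript Debugger> up
    BrightScript Debugger> var
    BrightScript Debugger> bigList = invalid
    BrightScript Debugger> ? runGarbageCollector()
    BrightScript Debugger> bscs
    If invalidating one particular variable makes the instance count plunge, that's your memory hog.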
  • Thanks -
    not using "XML-graphic scenes" 🙂
    I found the bulk of the objects show up after running ParseJSON on a ~3MB API response. By deleting a bunch of those objects immediately after the parse I've been able to trim the totals by 2/3 (roughly what I'm doing is sketched at the end of this post).
    In the console, when I delete large objects it doesn't seem to have an impact - I'll delete a big object and then bscs shows an even higher total...?
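    Roughly what the post-parse trimming looks like, just for illustration - the field names here are made up:
    parsed = ParseJson(raw)
    if parsed <> invalid
        if parsed.items <> invalid                  ' "items" is a made-up field name
            for each item in parsed.items
                item.Delete("description")          ' drop fields we never use (made-up names)
                item.Delete("thumbnails")
            end for
        end if
    end if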
  • "joetesta" wrote:
    In the console when I delete large objects it doesn't seem to be having an impact. I'll delete a big object then bscs shows even more total...?

    Almost certainly somebody else is still pointing to those big objects. Consider whether you were passing that object in function calls - or a function stored it in a local variable - either will keep it alive. As long as somebody - anybody - still points to an object, it's considered in use and retained (tiny example at the end of this post).

    A 3MB JSON is hefty. Do you really need that much? Can't the info be chopped into smaller, as-needed pieces? A hack I did a couple of years ago comes to mind - viewtopic.php?f=34&t=73292 - but I doubt it's needed here.
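    A tiny example of that retention point - everything here is made up, it is just the shape of the problem:
    function loadGuide()
        huge = ParseJson(ReadAsciiFile("tmp:/guide.json")) ' pretend this is the 3MB response
        m.lastResponse = huge    ' this one line keeps the whole parsed tree alive
        return huge.channels
    end function
    Even after every caller drops the return value, m.lastResponse still pins all those instances - setting it to invalid (or never storing it in the first place) is what lets the GC reclaim them.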
  • Thanks -
    ha, i already had your de-duper thread opened in another tab.
    We definitely don't need that much data, I just don't have control over the API so have to trim down what's there.
  • "joetesta" wrote:
    We definitely don't need that much data, I just don't have control over the API so have to trim down what's there.

    I see. Ha! This is interesting. Can you try using this right after parseJSON():

    ' Recursively walk the parsed-JSON tree and replace duplicate strings
    ' with one shared copy, using `deduper` as a case-sensitive intern table.
    function deflate(json, deduper=invalid):
        if deduper = invalid then deduper = { } : deduper.setModeCaseSensitive()

        if getInterface(json, "ifAssociativeArray") <> invalid:
            for each k in json:
                json[k] = deflate(json[k], deduper)
            end for
        elseif getInterface(json, "ifArray") <> invalid:
            for i = 0 to json.count() - 1:
                json[i] = deflate(json[i], deduper)
            end for
        elseif getInterface(json, "ifString") <> invalid:
            ' first occurrence goes into the table; later duplicates
            ' are replaced by that same shared instance
            s = deduper[json]
            if s = invalid then deduper[json] = json : s = deduper[json]
            json = s
        end if

        return json
    end function

    To see what effect it may (or may not) have: parse, print GC info, deflate, print GC info - something like:
    BrightScript Debugger> foo = parseJson(bar)
    BrightScript Debugger> ? runGarbageCollector(): foo = deflate(foo): ? runGarbageCollector()

    Can you try - and share the results?

    PS. I just tested it on a 6MB proper JSON, which sired about 103,000 instances. After deflating, that number went down to 64,000 - so 39,000 strings were eliminated. Per `bscs`, the remaining objects are 16,000 arrays, 6,800 dictionaries and 42,000 strings. In other words, 48% of the strings are gone as duplicates.