Posts posted by Remus Avram

  1. 2 minutes ago, Nebukadhezer said:

    we also use one session across threads and I am constantly running into the same behavior.
    As a workaround I am setting the auto_populate attribute to True at many places as this seems to be setting itself to False when the same session object is used from multiple threads...

    We tried multiple approaches; none of them worked.

    If you are using multiple threads, use multiple sessions, one per thread.
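
    A minimal sketch of that approach, assuming each worker creates its own ftrack_api.Session (the query and the pool size below are only examples):

    from multiprocessing.dummy import Pool as ThreadPool

    import ftrack_api


    def count_children(project_id):
        # Each call builds its own private session, so no session state
        # is shared between threads.
        session = ftrack_api.Session()
        project = session.get('Project', project_id)
        return len(project['children'])


    def main():
        # Collect the project ids once, then let every worker resolve
        # its own data through its own session.
        session = ftrack_api.Session()
        project_ids = [
            project['id'] for project in session.query('Project').all()
        ]

        pool = ThreadPool(4)
        print(pool.map(count_children, project_ids))


    if __name__ == '__main__':
        main()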

  2. Thanks @Martin Pengelly-Phillips for the info!

    So, as I understand it, the cache is built per session. If there is a session per thread, then each thread has its own cache file, which can contain the same data as the caches of the other sessions. Am I correct?

    With only one session there is a single cache file with all the data, and the queries are faster: fewer queries go to the database.

    Do you know if the sessions are connected?

    I did a test and it seems that they are: I queried an AssetBuild in one session, then created a Task in the other session using that AssetBuild as its parent, and the Task was created.
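
    A rough sketch of that test, assuming two independent sessions (the AssetBuild name and the Task name are placeholders):

    import ftrack_api

    # Two completely independent sessions against the same server.
    session_a = ftrack_api.Session()
    session_b = ftrack_api.Session()

    # Query the AssetBuild through the first session.
    asset_build = session_a.query(
        'AssetBuild where name is "my_asset_build"'
    ).one()

    # Create a Task through the second session, passing only the id of
    # the entity fetched by the first session.
    task = session_b.create('Task', {
        'name': 'my_task',
        'parent_id': asset_build['id']
    })
    session_b.commit()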

    @Martin Pengelly-Phillips: I am interested in how you are using the session. Are you creating a new session for each query/commit?

  3. Thanks @Mattias Lagergren for your answer!

    For us it's quite important because we are planning to use threads in all of our tools. We would like to use at least one worker thread so that the UI does not freeze while the data is being fetched.

    2 minutes ago, Mattias Lagergren said:

    One possible way to move forward with this is to change the example above and pass in the project id to the threads and then query for TypedContext where project_id is <project_id> in the thread.

    You mean something like this:

    from multiprocessing.dummy import Pool as ThreadPool
    
    import ftrack_api
    from ftrack_api.symbol import Symbol
    
    
    # One session shared by the main thread and every worker thread.
    session = ftrack_api.Session()
    
    
    def check_keys(entity):
        for key in entity.keys():
            if isinstance(entity[key], Symbol):
                print entity, ': ', key
    
    
    def check_children(entity_id):
        # Only the id crosses the thread boundary; the entity is
        # re-fetched through the shared session inside the worker thread.
        entity = session.get('TypedContext', entity_id)
        if 'children' in entity.keys():
            for child in entity['children']:
                check_keys(entity=child)
                check_children(entity_id=child['id'])
    
    
    def main():
        projects = session.query("Project").all()
        projects_id = [project["id"] for project in projects]
        # multiprocessing.dummy provides a pool of threads, not processes.
        pool = ThreadPool()
        pool.map(check_children, projects_id)
    
    
    if __name__ == "__main__":
        main()

    It still doesn't work. In the worker threads, session.get('TypedContext', entity_id) returns None most of the time.
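
    If the suggestion is meant literally as a query, I guess the worker would look something like this instead (still using the shared session; the helper below is only for illustration):

    def check_project(project_id):
        # Query all TypedContext entities of the project from inside the
        # worker thread, using only the project id passed in.
        children = session.query(
            'TypedContext where project_id is "{0}"'.format(project_id)
        ).all()
        for child in children:
            check_keys(entity=child)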

  4. Hi Mattias,

    ahh... are you going to make it thread safe?

    The problem is that, unless the session is explicitly created with auto-populate disabled (see the sketch below), it should never return a Symbol (NOT_SET) value.

    Creating a session per thread works as expected, but it doesn't help us because the sessions are not connected.
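
    A minimal sketch of the auto-populate behaviour we mean, relying on the auto_populate flag of Session and on Session.populate; only in this configuration would we expect NOT_SET values:

    import ftrack_api
    from ftrack_api.symbol import NOT_SET

    # With auto-population disabled, attributes that were not part of the
    # query stay as the NOT_SET symbol until populated explicitly.
    session = ftrack_api.Session(auto_populate=False)
    task = session.query('Task').first()
    print(task['name'] is NOT_SET)  # True: 'name' was never loaded.

    # After populating the attribute explicitly the real value is there.
    session.populate(task, 'name')
    print(task['name'] is NOT_SET)  # False.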

  5. Hi Ftrack Team,

    we would like to use the ftrack session in threads, but, unfortunately, it seems that ftrack_api is not thread safe.

    When we use the session from multiple threads, the attribute values of the entities come back as a Symbol (NOT_SET).

    Please find below a script where we were able to reproduce the issue:

    from multiprocessing.dummy import Pool as ThreadPool
    
    import ftrack_api
    from ftrack_api.symbol import Symbol
    
    
    # One session shared by the main thread and every worker thread.
    session = ftrack_api.Session()
    
    
    def check_keys(entity):
        # Report every attribute that comes back as an unloaded Symbol
        # (NOT_SET) instead of a real value.
        for key in entity.keys():
            if isinstance(entity[key], Symbol):
                print entity, ': ', key
    
    
    def check_children(entity):
        # Walk the hierarchy recursively; every entity comes from the one
        # shared session.
        if 'children' in entity.keys():
            for child in entity['children']:
                check_keys(entity=child)
                check_children(entity=child)
    
    
    def main():
        projects = session.query("Project").all()
        # multiprocessing.dummy provides a pool of threads, not processes.
        pool = ThreadPool()
        pool.map(check_children, projects)
    
    
    if __name__ == "__main__":
        main()

    ftrack_api version: 1.3.2

    ftrack server version: 3.5.6

     

  6. Hi all,

    is it possible to send custom notifications via ftrack_api?

    We are automating a lot with actions, but this also creates a lot of confusion, and we need to communicate better with the users. We are using the pop-up messages, but these are displayed for only 5 seconds: sometimes the users miss them, and sometimes the message disappears before they finish reading it.

    Sending notifications to the user will help a lot in understanding what is happening.

  7. Hi Ftrack Team,

    we would like a way to subscribe/unsubscribe (or follow/unfollow) an ftrack object (shot, asset build, etc.).

    Also we would like to be able to subscribe/unsubscribe other users to it.

    The current issue we have right now in production is that the coordinators don't have a way to split the shots between them and follow the activity on specific shots/asset builds/etc.

     

  8. Hello,

     

     

    System Settings -> Workflow -> Types: at the moment there is no way of sorting.

    If there are more than 50 types, it's very difficult to find what you are looking for.

    It would be great if we could sort by Color or Name.

     

    The same applies to System Settings -> Workflow -> Status Types: there is no way of sorting there either.

    It would be great if we could sort by Color, Name or Corresponds to.

     

     

    Thanks,

    Remus
