Average Inference Time per Sentence #65

@jwolley9701gh

Description

Hi, I was hoping to use AITS to evaluate the efficiency of another model as part of a project. The only implementation I could find is in demo.py; is `MLD Infer time - This/Ave batch: {infer_time/num_batch:.2f}` the value used to report AITS in the paper? I was also unable to find the supplementary material for the published paper when looking for more explanation of how AITS is calculated; is there a link to it? Thanks!

        print(f'MLD Infer time - This/Ave batch: {infer_time/num_batch:.2f}')
        print(f'MLD Infer FPS - Total batch: {num_all_frame/infer_time:.2f}')
        print(
            f'MLD Infer FPS - Running Poses Per Second: {num_ave_frame*infer_time/num_batch:.2f}')
        print(
            f'MLD Infer FPS - {num_all_frame/infer_time:.2f}s')

        # todo no num_batch!!!
        # num_batch=> num_forward
        print(
            f'MLD Infer FPS - time for 100 Poses: {infer_time/(num_batch*num_ave_frame)*100:.2f}'
        )
        print(
            f'Total time spent: {total_time:.2f} seconds (including model loading time and exporting time).'
        )
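For reference, here is a minimal sketch of how an average-inference-time-per-sentence metric could be measured, assuming it means total forward-pass wall-clock time divided by the number of sentences processed, excluding model loading and export time (the `model` callable and helper name are hypothetical, not from the repository):

```python
import time


def average_inference_time_per_sentence(model, sentences, batch_size=1):
    """Hypothetical helper: average forward-pass wall-clock time per sentence.

    Only the model call itself is timed, so loading and export
    time are excluded, matching the distinction drawn by the
    'Total time spent' print in demo.py.
    """
    infer_time = 0.0
    num_sentences = 0
    for i in range(0, len(sentences), batch_size):
        batch = sentences[i:i + batch_size]
        start = time.perf_counter()  # high-resolution wall clock
        model(batch)                 # forward pass only
        infer_time += time.perf_counter() - start
        num_sentences += len(batch)
    return infer_time / num_sentences
```

Note that `infer_time/num_batch` in the quoted snippet is time per *batch*; dividing by the number of sentences instead, as above, would give a per-sentence figure. Whether the paper's AITS uses batch size 1 is exactly what this issue asks.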
