FFmpeg
dnn_backend_common.h File Reference
#include "queue.h"
#include "../dnn_interface.h"
#include "libavutil/thread.h"

Go to the source code of this file.

Data Structures

struct   TaskItem
 
struct   LastLevelTaskItem
 
struct   DNNAsyncExecModule
  Common Async Execution Mechanism for the DNN Backends. More...
 

Macros

#define  DNN_BACKEND_COMMON_OPTIONS
 
Functions

int  ff_check_exec_params (void *ctx, DNNBackendType backend, DNNFunctionType func_type, DNNExecBaseParams *exec_params)
 
int  ff_dnn_fill_task (TaskItem *task, DNNExecBaseParams *exec_params, void *backend_model, int async, int do_ioproc)
  Fill the Task for Backend Execution. More...
 
int  ff_dnn_async_module_cleanup (DNNAsyncExecModule *async_module)
  Join the Async Execution thread and set module pointers to NULL. More...
 
int  ff_dnn_start_inference_async (void *ctx, DNNAsyncExecModule *async_module)
  Start asynchronous inference routine for the TensorFlow model on a detached thread. More...
 
DNNAsyncStatusType  ff_dnn_get_result_common (Queue *task_queue, AVFrame **in, AVFrame **out)
  Extract input and output frame from the Task Queue after asynchronous inference. More...
 
int  ff_dnn_fill_gettingoutput_task (TaskItem *task, DNNExecBaseParams *exec_params, void *backend_model, int input_height, int input_width, void *ctx)
  Allocate input and output frames and fill the Task with execution parameters. More...
 

Detailed Description

DNN common functions for different backends.

Definition in file dnn_backend_common.h.

Macro Definition Documentation

DNN_BACKEND_COMMON_OPTIONS

#define DNN_BACKEND_COMMON_OPTIONS
Value:
{ "nireq", "number of request", OFFSET(options.nireq), AV_OPT_TYPE_INT, { .i64 = 0 }, 0, INT_MAX, FLAGS }, \
{ "async", "use DNN async inference", OFFSET(options.async), AV_OPT_TYPE_BOOL, { .i64 = 1 }, 0, 1, FLAGS },

Definition at line 31 of file dnn_backend_common.h.

Function Documentation

ff_check_exec_params()

int ff_check_exec_params ( void *  ctx,
DNNBackendType  backend,
DNNFunctionType  func_type,
DNNExecBaseParams *  exec_params 
)

Definition at line 29 of file dnn_backend_common.c.

Referenced by ff_dnn_execute_model_native(), ff_dnn_execute_model_ov(), and ff_dnn_execute_model_tf().

ff_dnn_fill_task()

int ff_dnn_fill_task ( TaskItem *  task,
DNNExecBaseParams *  exec_params,
void *  backend_model,
int  async,
int  do_ioproc 
)

Fill the Task for Backend Execution.

It should be called after checking execution parameters using ff_check_exec_params.

Parameters
task pointer to the allocated task
exec_params pointer to execution parameters
backend_model void pointer to the backend model
async flag for async execution. Must be 0 or 1
do_ioproc flag for IO processing. Must be 0 or 1
Returns
0 if successful or error code otherwise.

Definition at line 56 of file dnn_backend_common.c.

Referenced by ff_dnn_execute_model_native(), ff_dnn_execute_model_ov(), ff_dnn_execute_model_tf(), and ff_dnn_fill_gettingoutput_task().

ff_dnn_async_module_cleanup()

int ff_dnn_async_module_cleanup ( DNNAsyncExecModule *  async_module )

Join the Async Execution thread and set module pointers to NULL.

Parameters
async_module pointer to DNNAsyncExecModule module
Returns
0 if successful or error code otherwise.

Definition at line 92 of file dnn_backend_common.c.

Referenced by destroy_request_item().

ff_dnn_start_inference_async()

int ff_dnn_start_inference_async ( void *  ctx,
DNNAsyncExecModule *  async_module 
)

Start asynchronous inference routine for the TensorFlow model on a detached thread.

It calls the completion callback after the inference completes. Completion callback and inference function must be set before calling this function.

If POSIX threads aren't supported, the execution rolls back to synchronous mode, calling completion callback after inference.

Parameters
ctx pointer to the backend context
async_module pointer to DNNAsyncExecModule module
Returns
0 on the start of async inference or error code otherwise.

Definition at line 111 of file dnn_backend_common.c.

Referenced by execute_model_tf(), and ff_dnn_flush_tf().

ff_dnn_get_result_common()

DNNAsyncStatusType ff_dnn_get_result_common ( Queue *  task_queue,
AVFrame **  in,
AVFrame **  out 
)

Extract input and output frame from the Task Queue after asynchronous inference.

Parameters
task_queue pointer to the task queue of the backend
in double pointer to the input frame
out double pointer to the output frame
Return values
DAST_EMPTY_QUEUE if task queue is empty
DAST_NOT_READY if inference not completed yet
DAST_SUCCESS if result successfully extracted

Definition at line 142 of file dnn_backend_common.c.

Referenced by ff_dnn_get_result_native(), ff_dnn_get_result_ov(), and ff_dnn_get_result_tf().

ff_dnn_fill_gettingoutput_task()

int ff_dnn_fill_gettingoutput_task ( TaskItem *  task,
DNNExecBaseParams *  exec_params,
void *  backend_model,
int  input_height,
int  input_width,
void *  ctx 
)

Allocate input and output frames and fill the Task with execution parameters.

Parameters
task pointer to the allocated task
exec_params pointer to execution parameters
backend_model void pointer to the backend model
input_height height of input frame
input_width width of input frame
ctx pointer to the backend context
Returns
0 if successful or error code otherwise.

Definition at line 162 of file dnn_backend_common.c.

Referenced by get_output_native(), get_output_ov(), and get_output_tf().


Generated on Wed Aug 24 2022 21:42:39 for FFmpeg by   doxygen 1.8.17
