attention#

Classes#

class models.dualprompt_utils.attention.PreT_Attention(dim, num_heads=8, qkv_bias=False, attn_drop=0.0, proj_drop=0.0)[source]#

Bases: Module

forward(x, prompt=None)[source]#
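
The class and method above are listed without docstrings, so the following is a hedged, minimal sketch of what a prefix-tuned multi-head attention like `PreT_Attention` typically does in DualPrompt-style code: the module computes standard ViT attention, and when a `prompt` tensor is passed to `forward`, it is split into key and value prefixes that are concatenated in front of the keys and values before the attention product. The class name `PreTAttentionSketch`, the assumed prompt layout `(B, 2, L, num_heads, head_dim)`, and all comments are illustrative assumptions, not the library's verified implementation.

```python
import torch
import torch.nn as nn


class PreTAttentionSketch(nn.Module):
    """Multi-head self-attention with optional prefix-tuning prompts.

    A sketch of the PreT_Attention idea: prompts are injected as extra
    key/value tokens rather than extra input tokens, so the query length
    (and thus the output length) is unchanged.
    """

    def __init__(self, dim, num_heads=8, qkv_bias=False,
                 attn_drop=0.0, proj_drop=0.0):
        super().__init__()
        assert dim % num_heads == 0, "dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.qkv = nn.Linear(dim, dim * 3, bias=qkv_bias)
        self.attn_drop = nn.Dropout(attn_drop)
        self.proj = nn.Linear(dim, dim)
        self.proj_drop = nn.Dropout(proj_drop)

    def forward(self, x, prompt=None):
        B, N, C = x.shape
        # Project to q, k, v and split heads: 3 x (B, heads, N, head_dim).
        qkv = (self.qkv(x)
               .reshape(B, N, 3, self.num_heads, self.head_dim)
               .permute(2, 0, 3, 1, 4))
        q, k, v = qkv.unbind(0)

        if prompt is not None:
            # Assumed prompt layout: (B, 2, L, num_heads, head_dim),
            # index 0 = key prefixes, index 1 = value prefixes.
            pk = prompt[:, 0].permute(0, 2, 1, 3)  # (B, heads, L, head_dim)
            pv = prompt[:, 1].permute(0, 2, 1, 3)
            k = torch.cat([pk, k], dim=2)  # keys/values grow to L + N
            v = torch.cat([pv, v], dim=2)

        # Scaled dot-product attention; queries keep length N, so the
        # output shape matches the input regardless of the prompt.
        attn = (q @ k.transpose(-2, -1)) * self.scale
        attn = self.attn_drop(attn.softmax(dim=-1))
        out = (attn @ v).transpose(1, 2).reshape(B, N, C)
        return self.proj_drop(self.proj(out))
```

As a usage illustration, passing a prompt of prefix length 5 to a 7-token input still yields a 7-token output, since only keys and values are extended:

```python
m = PreTAttentionSketch(dim=32, num_heads=4)
x = torch.randn(2, 7, 32)
p = torch.randn(2, 2, 5, 4, 8)  # (B, 2, L, num_heads, head_dim)
y = m(x, prompt=p)
assert y.shape == (2, 7, 32)
```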