Transferrable DP-Adapter Tuning: A Privacy-Preserving Multimodal Parameter-Efficient Fine-Tuning Framework


Abstract:

In recent years, multimodal large-scale pre-trained models have achieved tremendous success, becoming a milestone in the field of artificial intelligence and demonstrating the effectiveness of the pre-training and fine-tuning paradigm in the multimodal domain. As a result, multimodal pre-trained models have been widely applied in many areas of daily life, including privacy-sensitive ones such as medical diagnosis, financial analysis, public safety, and social media management. Adapting a multimodal pre-trained model to a specific task in these fields typically requires first acquiring downstream-task data and then fine-tuning on it, and during this process sensitive private information may be inadvertently learned and leaked. We therefore propose a privacy-preserving multimodal parameter-efficient fine-tuning framework: Transferrable DP-Adapter Tuning (TDPAT). In this framework, the model owner sends a lightweight adapter and a lossy compression emulator to the data owner, who fine-tunes the adapter on downstream data with the help of the emulator. During adapter fine-tuning, DP-SGD (Differentially Private Stochastic Gradient Descent), improved with ideas from multimodal contrastive learning, is incorporated to provide differential privacy. The fine-tuned adapter is then returned to the model owner, who plugs it into the full multimodal pre-trained model. TDPAT simultaneously achieves privacy protection in three respects: data security, model security, and inference security. Moreover, by using the parameter-efficient adapter method, it reduces fine-tuning costs and improves fine-tuning efficiency while preserving fine-tuning performance.
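
As an illustration of the adapter-plus-emulator workflow and the DP-SGD update described in the abstract, the following is a minimal PyTorch sketch. Everything here is an assumption for illustration: the BottleneckAdapter module, the dp_sgd_step function, the frozen linear layer standing in for the lossy compression emulator, and all hyperparameters are hypothetical, and the paper's contrastive-learning modification of DP-SGD is not reproduced; this shows only the standard per-sample clipping plus Gaussian noise recipe applied to adapter parameters.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Lightweight down-project / up-project adapter with a residual path."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))  # residual connection

def dp_sgd_step(adapter, emulator, batch, loss_fn,
                lr=1e-3, clip_norm=1.0, noise_multiplier=1.0):
    """One DP-SGD update on adapter parameters only.

    Standard recipe (Abadi et al., 2016): clip each per-sample gradient to
    `clip_norm`, sum, add Gaussian noise, average, then take a gradient step.
    `emulator` stands in for the model owner's lossy-compressed backbone,
    which stays frozen on the data owner's side.
    """
    params = [p for p in adapter.parameters() if p.requires_grad]
    accumulated = [torch.zeros_like(p) for p in params]

    xs, ys = batch
    for x, y in zip(xs, ys):  # microbatch loop: per-sample gradients
        features = emulator(x.unsqueeze(0))       # frozen compressed backbone
        loss = loss_fn(adapter(features), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)

        # Clip the per-sample gradient norm to bound each sample's influence.
        total_norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = (clip_norm / (total_norm + 1e-6)).clamp(max=1.0)
        for acc, g in zip(accumulated, grads):
            acc.add_(g * scale)

    with torch.no_grad():
        for p, acc in zip(params, accumulated):
            noise = torch.randn_like(acc) * (noise_multiplier * clip_norm)
            p -= lr * (acc + noise) / len(xs)     # noisy averaged gradient step

# Hypothetical usage: a frozen linear layer as the emulator, MSE objective.
emulator = nn.Linear(32, 128).requires_grad_(False)
adapter = BottleneckAdapter(hidden_dim=128)
xs, ys = torch.randn(8, 32), torch.randn(8, 128)
dp_sgd_step(adapter, emulator, (xs, ys), nn.MSELoss())
```

Because only the adapter's parameters receive (noisy) gradient updates while the emulator stays frozen, the privacy noise is confined to the small set of transferred weights, which is what keeps this kind of fine-tuning both parameter-efficient and compatible with differential privacy accounting.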
Date of Conference: 01-05 July 2024
Date Added to IEEE Xplore: 26 September 2024
Conference Location: Cambridge, United Kingdom
