
Can these modules be added to the pre-trained model? #1

Open
Daydaylight opened this issue Jul 7, 2023 · 2 comments

Comments

@Daydaylight

Hello, thank you for your contribution; it taught me a lot in detail.
I have a question: most researchers use pre-trained models whose extracted features are flattened into one dimension. Where should these attention modules be added in that case? If they are added inside the pre-trained model, will they destroy its original structure?
I value your view and look forward to your reply.

@changzy00
Owner

Hi, thank you for your question. One approach is to insert the attention module as a new block just before the global average pooling layer of the pre-trained model. This way, the original structure of the pre-trained model is not disturbed. Adding an attention module can improve performance by letting the model focus on the parts of the input that are most relevant to the task at hand.
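
For concreteness, here is a minimal PyTorch sketch of that idea, assuming a torchvision ResNet-18 backbone and a squeeze-and-excitation (SE) style channel attention module; the `SEBlock` class below is a hypothetical stand-in for whichever attention module from this repository you want to try:

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class SEBlock(nn.Module):
    """Squeeze-and-excitation channel attention (illustrative implementation)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(x.mean(dim=(2, 3)))   # squeeze: (B, C) channel descriptor
        return x * w.view(b, c, 1, 1)     # excite: reweight the feature maps

model = resnet18(weights="IMAGENET1K_V1")
# ResNet-18's last conv stage (layer4) outputs 512 channels; append the
# attention block after it, i.e. just before the global average pooling.
model.layer4 = nn.Sequential(model.layer4, SEBlock(512))

x = torch.randn(1, 3, 224, 224)
print(model(x).shape)  # torch.Size([1, 1000]) - output shape is unchanged
```

Because the new block is appended after `layer4` rather than spliced into it, every pretrained weight is left untouched; only the attention module's parameters start from random initialization and are trained (or fine-tuned together with the backbone).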

@Daydaylight
Author


Thanks for your answer. I think it is a good idea and I will try it.
