Hello, thank you for your contribution; it taught me a lot in detail.
I have a question: most researchers use pre-trained models whose extracted features are flattened into one dimension. Where should these attention modules be added in that case? If they are added to the pre-trained model, will that destroy its original structure?
I value your view and look forward to your reply.
Hi, thank you for your question. One option is to insert the attention module as a new block just before the global average pooling layer of the pre-trained model. That way, the original structure of the pre-trained model is not disturbed. Adding an attention module can improve performance by letting the model focus on the parts of the input that are most relevant to the task at hand.
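Here is a minimal sketch of that idea in PyTorch, assuming a torchvision ResNet-50 backbone and an SE-style channel attention block as a stand-in for whichever attention module you want to try (the names `SEAttention` and `BackboneWithAttention` are illustrative, not part of this repository). The attention is applied while the features are still 2-D feature maps, and the flattening to 1-D only happens afterwards, so the pre-trained layers themselves are left unchanged:

```python
import torch
import torch.nn as nn
from torchvision import models


class SEAttention(nn.Module):
    """Squeeze-and-Excitation style channel attention (illustrative stand-in
    for whatever attention module you actually want to insert)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # re-weight channels; spatial size is unchanged


class BackboneWithAttention(nn.Module):
    """Pre-trained conv stages -> new attention block -> GAP -> classifier."""
    def __init__(self, num_classes):
        super().__init__()
        resnet = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        # Keep everything up to (but not including) the original avgpool/fc,
        # so the pre-trained structure and weights stay intact.
        self.features = nn.Sequential(*list(resnet.children())[:-2])
        self.attention = SEAttention(channels=2048)   # new block before GAP
        self.gap = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(2048, num_classes)

    def forward(self, x):
        x = self.features(x)         # pre-trained backbone, untouched
        x = self.attention(x)        # attention on 2-D feature maps
        x = self.gap(x).flatten(1)   # only now flatten to 1-D
        return self.classifier(x)


if __name__ == "__main__":
    model = BackboneWithAttention(num_classes=10)
    out = model(torch.randn(2, 3, 224, 224))
    print(out.shape)  # torch.Size([2, 10])
```

The same wrapping pattern works for other backbones: cut the model just before its pooling/classifier head, insert the attention block there, and add your own pooling and classifier on top.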
Thanks for your answer, I think it is a good idea and I will try it.