
About the Aggregator: concatenating two layers' information, please help me #102

Open
wu33learn opened this issue May 18, 2024 · 2 comments

@wu33learn

The Preprocessing output has shape `(B*784, 2, 1024)`. It is computed as
`torch.stack([(B*784, 1024), (B*784, 1024)], dim=1)`, where the two `(B*784, 1024)` tensors are the MeanMapper outputs for layer2 and layer3.
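
For concreteness, a minimal sketch of that stacking step (the batch size `B` and the names `feat_l2`/`feat_l3` are placeholders I made up; only the shapes come from the code):

```python
import torch

B = 2  # placeholder batch size; 784 is the number of patch positions in the shapes above
feat_l2 = torch.randn(B * 784, 1024)  # MeanMapper output for layer2
feat_l3 = torch.randn(B * 784, 1024)  # MeanMapper output for layer3

# Stacking along dim=1 puts the two layers on a new axis.
stacked = torch.stack([feat_l2, feat_l3], dim=1)
print(stacked.shape)  # torch.Size([1568, 2, 1024]), i.e. (B*784, 2, 1024)
```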

And in the Aggregator, after a reshape to `(B*784, 1, 2048)`, an `adaptive_avg_pool1d(target_dim=1024)` is applied to get an output of `(B*784, 1, 1024)`. Is the intent to average layer2 and layer3 at the same feature dimension? If so, this operation cannot achieve that.
An example: stacking `[1,2,3,4,5]` and `[1,1,1,1,1]` gives `[[1,2,3,4,5],[1,1,1,1,1]]`, and reshaping gives `[[1,2,3,4,5,1,1,1,1,1]]`. `adaptive_avg_pool1d` then averages adjacent values along that line, computing (1+2)/2 = 1.5, (3+4)/2 = 3.5, and so on; it does not compute (1+1)/2, (2+1)/2, (3+1)/2, i.e. the values at the same position in each vector. Maybe this is not a good example.
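
The toy example can be checked directly (a small sketch; the printed values match the arithmetic above):

```python
import torch
import torch.nn.functional as F

a = torch.tensor([1., 2., 3., 4., 5.])  # stands in for a layer2 feature
b = torch.tensor([1., 1., 1., 1., 1.])  # stands in for a layer3 feature

stacked = torch.stack([a, b], dim=0)  # shape (2, 5)
flat = stacked.reshape(1, 1, -1)      # shape (1, 1, 10): the two vectors laid end to end

# adaptive_avg_pool1d averages adjacent windows along the last axis,
# so it mixes neighbouring entries of the SAME vector, not entries at
# the same position across the two vectors.
pooled = F.adaptive_avg_pool1d(flat, 5)
print(pooled)  # tensor([[[1.5000, 3.5000, 3.0000, 1.0000, 1.0000]]])
#                          (1+2)/2  (3+4)/2  (5+1)/2  (1+1)/2  (1+1)/2

# An element-wise average at the same position would instead give:
print(stacked.mean(dim=0))  # tensor([1.0000, 1.5000, 2.0000, 2.5000, 3.0000])
```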

Maybe I am wrong; please tell me. Thank you for your help!

@wu33learn
Author

Help me, please.

@wu33learn
Author

The code is really hard to understand; reading it makes my head spin.
