Add torch_geometric.nn.attention to docs #10089

Open · wants to merge 13 commits into master

Conversation

@xnuohz (Contributor) commented Mar 1, 2025:

[image]

@akihironitta (Member) left a comment:

Nice! Could we have a list similar to the one for conv layers?

Convolutional Layers
--------------------

.. currentmodule:: torch_geometric.nn.conv

.. autosummary::
   :nosignatures:
   :toctree: ../generated
   :template: autosummary/nn.rst

   {% for name in torch_geometric.nn.conv.classes %}
     {{ name }}
   {% endfor %}

@xnuohz (Contributor, Author) commented Mar 1, 2025:

There are too many conv modules; if we list all of them in detail, I think searching will be inconvenient.

@akihironitta (Member) left a comment:

Wdym? nn.attention has 4 layers:

$ git grep "class " torch_geometric/nn/attention/
torch_geometric/nn/attention/performer.py:class PerformerProjection(torch.nn.Module):
torch_geometric/nn/attention/performer.py:class PerformerAttention(torch.nn.Module):
torch_geometric/nn/attention/qformer.py:class QFormer(torch.nn.Module):
torch_geometric/nn/attention/sgformer.py:class SGFormerAttention(torch.nn.Module):
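
For context, a minimal sketch of what such a section could look like (an assumption based on the conv template above, not necessarily this PR's actual diff). It lists the classes explicitly, since it is unclear whether torch_geometric.nn.attention exposes a classes attribute analogous to torch_geometric.nn.conv.classes, and PerformerProjection is presumably an internal helper:

.. sketch only: class names taken from the git grep output above; switch
   to a templated loop if ``nn.attention`` exposes a ``classes`` attribute

Attention Layers
----------------

.. currentmodule:: torch_geometric.nn.attention

.. autosummary::
   :nosignatures:
   :toctree: ../generated
   :template: autosummary/nn.rst

   PerformerAttention
   QFormer
   SGFormerAttention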

@xnuohz (Contributor, Author) commented Mar 1, 2025:

I misread it as asking to expand all the conv modules haha

codecov bot commented Mar 1, 2025:

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 85.43%. Comparing base (c211214) to head (766eeb9).
Report is 29 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master   #10089      +/-   ##
==========================================
- Coverage   86.11%   85.43%   -0.68%     
==========================================
  Files         496      496              
  Lines       33655    34003     +348     
==========================================
+ Hits        28981    29051      +70     
- Misses       4674     4952     +278     

@xnuohz (Contributor, Author) commented Mar 4, 2025:

@akihironitta Fixed the webinar URL and added docstrings for the LLM wrappers, lmk if anything else is needed before merging.

xnuohz requested a review from akihironitta on May 6, 2025 at 16:51