6/29/2023

Graph Neural Network Study Tutorial

 

Stanford CS224W Tutorials

https://data.pyg.org/img/cs224w_tutorials.png

The Stanford CS224W course has collected a set of graph machine learning tutorial blog posts, fully realized with PyG. Students worked on projects spanning all kinds of tasks, model architectures, and applications. All tutorials also link to a Google Colab notebook with the tutorial's code so you can follow along as you read!

PyTorch Geometric Tutorial Project

The PyTorch Geometric Tutorial project provides video tutorials and Colab notebooks for a variety of different methods in PyTorch Geometric:

  1. Introduction [YouTube, Colab]

  2. PyTorch basics [YouTube, Colab]

  3. Graph Attention Networks (GATs) [YouTube, Colab]

  4. Spectral Graph Convolutional Layers [YouTube, Colab]

  5. Aggregation Functions in GNNs [YouTube, Colab]

  6. (Variational) Graph Autoencoders (GAE and VGAE) [YouTube, Colab]

  7. Adversarially Regularized Graph Autoencoders (ARGA and ARGVA) [YouTube, Colab]

  8. Graph Generation [YouTube]

  9. Recurrent Graph Neural Networks [YouTube, Colab (Part 1), Colab (Part 2)]

  10. DeepWalk and Node2Vec [YouTube (Theory), YouTube (Practice), Colab]

  11. Edge analysis [YouTube, Colab (Link Prediction), Colab (Label Prediction)]

  12. Data handling in PyTorch Geometric (Part 1) [YouTube, Colab]

  13. Data handling in PyTorch Geometric (Part 2) [YouTube, Colab]

  14. MetaPath2vec [YouTube, Colab]

  15. Graph pooling (DiffPool) [YouTube, Colab]
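As a taste of what tutorial 5 (aggregation functions) covers, here is a minimal sketch of sum and mean neighborhood aggregation in plain torch, without torch_geometric. The toy graph, feature sizes, and the 2 x num_edges edge_index layout are my own illustrative assumptions; the layout mirrors the convention PyTorch Geometric uses.

```python
import torch

# Toy graph with 3 nodes and 4 directed edges (src -> dst),
# stored in the 2 x num_edges layout used by PyTorch Geometric.
edge_index = torch.tensor([[0, 1, 2, 0],
                           [1, 0, 0, 2]])
x = torch.randn(3, 4)  # one 4-dim feature vector per node

src, dst = edge_index

# Sum aggregation: accumulate each source node's features at its destination.
sum_agg = torch.zeros_like(x).index_add_(0, dst, x[src])

# Mean aggregation: normalize by in-degree (clamped to avoid division by zero).
deg = torch.zeros(x.size(0)).index_add_(0, dst, torch.ones(dst.size(0))).clamp(min=1)
mean_agg = sum_agg / deg.unsqueeze(-1)

print(sum_agg.shape, mean_agg.shape)  # both torch.Size([3, 4])
```

Node 0 here receives messages from nodes 1 and 2, so its aggregated row is x[1] + x[2] (summed) or their average (mean). Libraries replace this with optimized scatter operations, but the idea is the same.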

Text Summarization Datasets

**Paper:**

https://arxiv.org/abs/1908.08345 (Text Summarization with Pretrained Encoders, Liu & Lapata, 2019)


**Dataset:**

1) The CNN/DailyMail news highlights dataset: somewhat extractive

- News articles paired with related highlights that give a brief overview of each article

- Input document: limited to 512 tokens

- https://www.kaggle.com/datasets/gowrishankarp/newspaper-text-summarization-cnn-dailymail


2) The New York Times Annotated Corpus (NYT): somewhat extractive

- Contains 110,540 articles with abstractive summaries

- Input document: limited to 800 tokens

- https://research.google/resources/datasets/ny-times-annotated-corpus/


3) XSum: abstractive

- 226,711 news articles answering the question "What is this article about?", each paired with a one-sentence summary

- Input document: limited to 512 tokens

- https://github.com/google-research-datasets/xsum_hallucination_annotations
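All three datasets cap the input document length (512 or 800 tokens). A minimal sketch of that truncation step, assuming simple whitespace tokenization; real pipelines such as BertSum count subword tokens instead, so this is only an illustration of the idea:

```python
def truncate_document(text: str, max_tokens: int = 512) -> str:
    """Keep only the first max_tokens whitespace-separated tokens."""
    tokens = text.split()
    return " ".join(tokens[:max_tokens])

# Dummy article of 1000 tokens, cut down to the CNN/DailyMail limit of 512.
article = "word " * 1000
truncated = truncate_document(article, max_tokens=512)
print(len(truncated.split()))  # 512
```

For the NYT corpus you would pass max_tokens=800 instead.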

6/10/2023

All Readings: Introduction to Generative AI (G-GENAI-I)

Here are the assembled readings on generative AI:


●  Ask a Techspert: What is generative AI? https://blog.google/inside-google/googlers/ask-a-techspert/what-is-generative-ai/


●  Build new generative AI powered search & conversational experiences with Gen App Builder: https://cloud.google.com/blog/products/ai-machine-learning/create-generative-apps-in-minutes-with-gen-app-builder


●  What is generative AI? https://www.mckinsey.com/featured-insights/mckinsey-explainers/what-is-generative-ai


●  Google Research, 2022 & beyond: Generative models: https://ai.googleblog.com/2023/01/google-research-2022-beyond-language.html#GenerativeModels


●  Building the most open and innovative AI ecosystem: https://cloud.google.com/blog/products/ai-machine-learning/building-an-open-generative-ai-partner-ecosystem


●  Generative AI is here. Who Should Control It? https://www.nytimes.com/2022/10/21/podcasts/hard-fork-generative-artificial-intelligence.html


●  Stanford U & Google's Generative Agents Produce Believable Proxies of Human Behaviors: https://syncedreview.com/2023/04/12/stanford-u-googles-generative-agents-produce-believable-proxies-of-human-behaviours/


●  Generative AI: Perspectives from Stanford HAI: https://hai.stanford.edu/sites/default/files/2023-03/Generative_AI_HAI_Perspectives.pdf


●  Generative AI at Work: https://www.nber.org/system/files/working_papers/w31161/w31161.pdf


●  The future of generative AI is niche, not generalized: https://www.technologyreview.com/2023/04/27/1072102/the-future-of-generative-ai-is-niche-not-generalized/
Here are the assembled readings on large language models:


●  NLP's ImageNet moment has arrived: https://thegradient.pub/nlp-imagenet/


●  Google Cloud supercharges NLP with large language models: https://cloud.google.com/blog/products/ai-machine-learning/google-cloud-supercharges-nlp-with-large-language-models


●  LaMDA: our breakthrough conversation technology: https://blog.google/technology/ai/lamda/

6/02/2023

Torch tensor padding example code (torch.nn.functional.pad):



import torch
import torch.nn.functional as F

tensor = torch.randn(2, 3, 4) # Original tensor
print("Original tensor shape:", tensor.shape)

# Case 1: Pad the last dimension (dimension -1) -> resulting shape: [2, 3, 8]
padding_size = 4
padded_tensor = F.pad(tensor, (padding_size, 0)) # Add padding to the left of the last dimension
print("Case 1 tensor shape:", padded_tensor.shape)

# Case 2: Pad the second-to-last dimension (dimension -2) -> resulting shape: [2, 8, 4]
padding_size = 5
padded_tensor = F.pad(tensor, (0, 0, padding_size, 0)) # Add padding to the left of the second-to-last dimension
print("Case 2 tensor shape:", padded_tensor.shape)

# Case 3: Pad the first dimension (dimension 0) -> resulting shape: [7, 3, 4]
padding_size = 5
padded_tensor = F.pad(tensor, (0, 0, 0, 0, padding_size, 0)) # Add padding to the left of the first dimension
print("Case 3 tensor shape:", padded_tensor.shape)

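One detail worth spelling out from the cases above: F.pad reads the pad tuple as (left, right) pairs starting from the LAST dimension, which is why padding an earlier dimension needs extra (0, 0) pairs. A small sketch of two-sided padding and a custom fill value (the shapes here are my own toy choices):

```python
import torch
import torch.nn.functional as F

t = torch.randn(2, 3, 4)

# (1, 2) -> pad the last dim by 1 on the left and 2 on the right: (2, 3, 7)
print(F.pad(t, (1, 2)).shape)

# (1, 2, 3, 4) -> last dim +1+2, second-to-last dim +3+4: (2, 10, 7)
print(F.pad(t, (1, 2, 3, 4)).shape)

# Pad with a fill value other than the default zeros.
padded = F.pad(t, (0, 1), value=-1.0)
print(padded[0, 0, -1])  # tensor(-1.)
```

This generalizes the left-only padding used in the three cases above: each dimension gets its own (left, right) pair, innermost dimension first.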


www.marearts.com

Thank you. πŸ™‡πŸ»‍♂️