I am trying to experiment with GNNs (graph neural networks) in Python. Instead of using shapefiles, I am using randomly generated sample data. Here is the code:
```
from shapely.geometry import Point
import random
import matplotlib.pyplot as plt
import networkx as nx
import torch
from torch_geometric.data import Data
# Define the number of points in each set
num_points_set1 = 100
num_points_set2 = 50
# Define the bounding box for the points
min_x, min_y, max_x, max_y = (0, 0, 10, 10)
# Create a list to hold the points in set 1
data_set_1 = []
# Generate random points for set 1
for i in range(num_points_set1):
    x = random.uniform(min_x, max_x)
    y = random.uniform(min_y, max_y)
    data_set_1.append(Point(x, y))
# Create a list to hold the points in set 2
data_set_2 = []
# Generate random points for set 2
for i in range(num_points_set2):
    x = random.uniform(min_x, max_x)
    y = random.uniform(min_y, max_y)
    data_set_2.append(Point(x, y))
# Nodes and edges
# Concatenate two data sets
data_set = data_set_1 + data_set_2
# Define the parameters alpha and beta
alpha = 0.4
beta = 0.1
# Create the graph
G = nx.waxman_graph(data_set, alpha=alpha, beta=beta)
# Convert the graph to tensors
edge_index = torch.tensor(list(G.edges()), dtype=torch.long)
x = torch.tensor([point.x for point in data_set_1 + data_set_2], dtype=torch.float)
```
I am getting the following error when I run the second-to-last line, the `torch.tensor()` call. How can I fix it?
```
TypeError Traceback (most recent call last)
1 # Create the graph
2 edge_index = torch.tensor(list(G.edges()), dtype=torch.long)
3 x = torch.tensor([point.x for point in data_set_1 + data_set_2], dtype=torch.float)
4
5 # Create the data object
TypeError: 'Point' object cannot be interpreted as an integer
```
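For reference, I suspect that because I passed the list of `Point` objects to `nx.waxman_graph`, networkx used the `Point`s themselves as node labels, so `G.edges()` yields pairs of `Point`s instead of integers. Below is a minimal sketch of the relabeling idea in plain Python (the node names here are placeholders I made up for illustration; in networkx I believe the helper `nx.convert_node_labels_to_integers(G)` does the same thing):

```python
# Sketch: map arbitrary (non-integer) node objects to integer indices,
# which is what torch.tensor(..., dtype=torch.long) needs for edge_index.
# The string labels below stand in for shapely Point objects.
nodes = ["p0", "p1", "p2"]            # hypothetical node labels
edges = [("p0", "p1"), ("p1", "p2")]  # edges between those labels

# Build a label -> integer mapping
index = {node: i for i, node in enumerate(nodes)}

# Re-express each edge as a pair of integer indices
edge_index = [[index[u], index[v]] for u, v in edges]
print(edge_index)  # → [[0, 1], [1, 2]]
```

Is relabeling the nodes to integers like this the right way to go, or should I build the Waxman graph differently in the first place?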