The 1st International Workshop on

Methods, Models, and Resources for Geospatial Knowledge Graphs and GeoAI

co-located with GIScience 2020, Poznań, Poland

Call for Papers

The rapid increase in high-quality data, advances in machine learning algorithms, and the availability of fast hardware have contributed to a renewed interest in Artificial Intelligence (AI). Despite many success stories in computer vision, natural language processing, and speech recognition, many challenges remain unsolved, such as large-scale neural-symbolic reasoning over unstructured text and automatic knowledge graph construction. Interestingly, one of the most prominent topics in AI today is the combination of representation learning techniques (connectionist AI) with the symbolic representation and reasoning associated with knowledge graphs (symbolic AI), in order to develop scalable and interpretable machine learning models. One good example is knowledge graph embedding models, which aim to represent components of knowledge graphs, such as entities and relations, as continuous vectors or matrices while preserving the graph's structural information.

From a geospatial point of view, GeoAI, as an interdisciplinary field between GIScience and AI, advocates the development and use of AI techniques in geography and earth science (Couclelis, 1986; Openshaw et al., 1997; Janowicz et al., 2019). Geospatial knowledge graphs, as symbolic representations of geospatial knowledge, are at the core of GeoAI and facilitate many intelligent applications such as geospatial data integration and knowledge discovery. In fact, geospatial data plays an important role in the Linked Open Data cloud, an open cross-domain collection of interlinked knowledge graphs, since spatio-temporal scopes are essential for describing events, people, and objects. However, many relational machine learning models treat geographic entities as ordinary entities, neglecting the spatial footprints of places and ignoring the distance decay effect. This results in suboptimal performance on many geospatial tasks such as geospatial knowledge graph completion, geographic question answering, geographic entity alignment, and geographic knowledge graph summarization.
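To make the knowledge graph embedding idea concrete, the following is a minimal, illustrative sketch of a TransE-style scoring function: each entity and relation is a learned vector, and a triple (head, relation, tail) is scored by the distance between head + relation and tail, with lower scores indicating more plausible facts. The toy entities and random vectors below are hypothetical and purely for illustration; a real model would train these vectors from data.

```python
import math
import random

DIM = 4  # toy embedding dimensionality
random.seed(42)

def rand_vec(dim=DIM):
    """Random initial embedding; in practice these are learned by training."""
    return [random.uniform(-1, 1) for _ in range(dim)]

# Hypothetical toy geographic entities and one relation
entities = {name: rand_vec() for name in ["Poznan", "Poland", "Warsaw"]}
relations = {"locatedIn": rand_vec()}

def score(h, r, t):
    """TransE-style plausibility: L2 distance between (h + r) and t."""
    return math.sqrt(sum((hv + rv - tv) ** 2
                         for hv, rv, tv in zip(entities[h],
                                               relations[r],
                                               entities[t])))

# Lower score = the model considers the triple more plausible
s = score("Poznan", "locatedIn", "Poland")
```

Training would then adjust the vectors so that observed triples score lower than corrupted ones, preserving the graph's structure in the embedding space.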

In addition, there is strong demand for further advances in other research topics related to GeoAI, such as remote sensing and street view image analysis, transportation modeling, and geo-text analysis. There are, for instance, many challenges in adapting deep learning techniques to these scenarios, including the limited availability of labeled data and the difficulty models have generalizing across locations. Incorporating geospatial knowledge (i.e., prior knowledge about the structure of objects on the surface of the Earth and about the fundamental rules of geography) directly into deep neural network models, in the form of specially designed components and/or regularization schemes, is a promising approach to address these challenges.
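One way to picture "geospatial knowledge as a regularization scheme" is a distance-decay penalty that encourages the representations of geographically nearby places to stay similar, reflecting the distance decay effect. The function names, the exponential decay form, and the scale parameter below are all illustrative assumptions, not a prescribed method:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def decay_weight(dist_km, scale_km=100.0):
    """Weight that shrinks as geographic distance grows (distance decay)."""
    return math.exp(-dist_km / scale_km)

def spatial_regularizer(emb_a, emb_b, coords_a, coords_b):
    """Penalize embedding disagreement between places, weighted by proximity:
    nearby places (large weight) are pushed toward similar embeddings."""
    d = haversine_km(*coords_a, *coords_b)
    emb_dist_sq = sum((a - b) ** 2 for a, b in zip(emb_a, emb_b))
    return decay_weight(d) * emb_dist_sq

# Example with approximate coordinates for Poznan and Warsaw
penalty = spatial_regularizer([0.1, 0.2], [0.3, -0.1],
                              (52.41, 16.93), (52.23, 21.01))
```

Such a term could be added to a model's training loss so that the optimizer trades off task accuracy against geographic plausibility.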

Based on the above observations, this combined workshop and tutorial emphasizes the importance of geospatial information and principles in designing, developing, and utilizing geospatial knowledge graphs and other GeoAI techniques. Accordingly, we call for new methods, models, and resources for advancing research related to Geospatial Knowledge Graphs and GeoAI.

List of Relevant Topics

  • Deep Learning and Reinforcement Learning on Geospatial Knowledge Graphs
    • Geographic Knowledge Graph Embeddings
    • Geographic Question Answering and Semantic Parsing based on Knowledge Graphs
    • Geospatial Knowledge Graph Summarization
  • Geo-Ontology Engineering and Geospatial Knowledge Graph Construction
    • Spatio-Temporal Scoping of Knowledge Graphs
    • Gazetteer Data Management
    • Coreference Resolution for Geographic Entities
    • Geographic Ontology Alignment
    • Geospatial Knowledge Graph Construction and Completion
    • Geographic Entity Similarity Measurement
  • Querying and Visualization on Geospatial Knowledge Graphs
    • GeoSPARQL and Spatial Query Evaluation
    • Knowledge Graph Visualization
    • Geo-Ontology Visualization
  • Geographic Information Retrieval and Geo-Text Analysis
    • Text Geoparsing, Toponym Recognition, and Toponym Resolution
    • Information Extraction from Location-Based Social Media
    • Searching and Indexing Texts based on Locations
    • Open Domain Geographic Question Answering
    • Human Experience Extraction from Place Descriptions
  • Spatially Explicit Machine Learning Methods and Models for GeoAI
    • Bridging GIScience Methods with Deep Learning
    • Model Invariance/Equivariance for Geospatial Applications (e.g., equivariance to changes in input scale or rotation)
  • GeoAI for Geospatial Image Analysis
    • Classification, Segmentation, and Object/Instance Recognition
    • Remote Sensing Images
    • Street View Images
    • Scanned Paper Maps and Historical Imagery
  • GeoAI Resources and Infrastructures
    • Data Augmentation Strategies and Dataset Generation
    • Development of Benchmark Datasets, Tools, and Platforms
    • Spatial Data Infrastructures Supporting GeoAI
  • Other GeoAI Topics and Applications
    • Transportation Modeling and Trajectory Data Analysis
    • Spatial Optimization
    • Spatio-Temporal Data Fusion and Assimilation
    • Spatial Simulation (e.g., Learning Agents in Agent-Based Simulations)

Important Dates

  • Paper submission: June 30, 2020
  • Acceptance decision: July 30, 2020
  • Camera ready version: August 15, 2020
  • Workshop date: September 15, 2020

Workshop Format

This workshop will have a half-day for tutorial sessions and a half-day for research presentations. We welcome short research articles and industry demonstration papers (4 pages) on relevant topics.

Workshop Proceedings

All presented papers will be made available through CEUR-WS proceedings, contingent upon the authors' agreement. We will also consider organizing a special issue in a journal.

Organizers

Contact

For any further information, please contact Gengchen Mai or Ling Cai.