
Large-scale networks are now used by nearly a billion users and form an indispensable part of modern society. Examples include the web structure of the Internet and social networks such as Facebook and Twitter. All of these networks are expanding rapidly and are expected to grow to a scale of more than 10 billion users in the near future.
The volume of information on these expanding networks is growing faster than hardware performance. Current data-processing algorithms will soon be unable to handle such so-called "big data" at a practical speed, particularly for a network on the scale of 10 billion users (a 10^{10}-class network). Developing high-speed data-processing algorithms is therefore an urgent task.
Against this backdrop, the Kawarabayashi Large Graph Project aims to develop such high-speed data-processing algorithms. In this project, we model a large-scale network as a structure of nodes connected by edges; in other words, we regard the overall network as a massive graph with more than 10 billion nodes. On this basis, the project endeavors to establish algorithms for analyzing this massive graph using cutting-edge mathematical approaches, including theoretical computer science and discrete mathematics.
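To make the graph view above concrete, the following is a minimal sketch (not the project's actual implementation) of how a network of users and connections can be represented as a graph via an adjacency list; the node names and edges are purely illustrative.

```python
from collections import defaultdict

def build_adjacency(edges):
    """Build an undirected adjacency list from (u, v) edge pairs.

    Each user is a node; each connection (e.g. a friendship) is an edge.
    """
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return adj

# Toy network: four users linked by three friendship edges.
edges = [("alice", "bob"), ("bob", "carol"), ("carol", "dave")]
adj = build_adjacency(edges)
print(sorted(adj["bob"]))  # → ['alice', 'carol']
```

At the scale the project targets (more than 10^{10} nodes), such an in-memory representation is of course infeasible on a single machine, which is precisely why specialized high-speed algorithms are needed.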