A short bio

The group was started in April 2012 and combined research in the areas of "Complex network analysis", "Algorithmic graph theory", and "Computer-assisted cognitive science". The interdisciplinary education of Professor Zweig, the multi-disciplinary group consisting of mathematicians, physicists, and computer scientists, and our cooperation partners, be it from cancer research, ecology, economics, psychology, or law, have added one overarching research interest to the group: Network Analysis Literacy (when to use which method to analyze networks). With the rise of big data and automated decision-making processes (based on Artificial Intelligence and Data Science methods such as complex network analysis), it became clear to Professor Zweig that her research should encompass Algorithmic Accountability. So, in 2017, the Graph Theory and Complex Network Analysis Group became the Algorithmic Accountability Lab.

We are always interested in new collaborations and look forward to hearing from you (contact).

Algorithmic Accountability

Our societies are facing increasingly complex problems, so decision making is often supported by, or even delegated to, algorithms. Algorithmic decision making (ADM) processes are complex socio-technical systems which interact with society on a large scale. Credit scoring, automatic job candidate selection, predictive policing, and recidivism risk assessment are examples of ADM processes already in use.
The Algorithmic Accountability Lab:
  1. Analyzes and explains existing ADM processes for an interdisciplinary audience of lawyers, political scientists, media scientists, journalists, and the general public.
  2. Analyzes when ADM processes are necessary.
  3. Develops measures to quantify the impact ADM has on society.
  4. Analyzes the mathematical properties of, and the correlations between, different machine learning methods and their quality measures (a small sketch of one such measure is given below).
  5. Develops a formalized and structured approach to designing such systems in a transparent, ethical, and accountable way.
  6. Proposes didactic tools for the general public and for decision makers concerning the risks and promises of ADM.
The methods used by the lab are at the intersection of the natural, technical, and human sciences, following the idea that such socio-technical systems can only be understood, predicted, and controlled within their application context. See related projects.
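As an illustration of the quality measures mentioned in points 3 and 4, the following minimal sketch compares one widespread measure, the false positive rate, between two groups affected by a hypothetical risk-scoring system. The group names, labels, and predictions are invented for illustration and do not come from any of our projects.

    # Minimal sketch (not the lab's actual tooling): comparing a quality
    # measure of an ADM system across two hypothetical groups.
    # All data below are made up for illustration.

    def false_positive_rate(y_true, y_pred):
        """Fraction of actual negatives that were wrongly classified as positive."""
        fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
        tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
        return fp / (fp + tn) if (fp + tn) > 0 else float("nan")

    # Hypothetical predictions of a risk-assessment model for two groups.
    group_a_true = [0, 0, 1, 0, 1, 0, 0, 1]
    group_a_pred = [0, 1, 1, 0, 1, 0, 1, 1]
    group_b_true = [0, 1, 0, 0, 1, 0, 1, 0]
    group_b_pred = [0, 1, 0, 0, 1, 0, 1, 0]

    fpr_a = false_positive_rate(group_a_true, group_a_pred)
    fpr_b = false_positive_rate(group_b_true, group_b_pred)
    print(f"FPR group A: {fpr_a:.2f}, FPR group B: {fpr_b:.2f}")

A systematic difference between the two rates is one simple way to quantify that an ADM system treats the groups unequally.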

Network Analysis Literacy

Network analysis literacy is concerned with when to use which method to analyze networks. Many methods have been proposed in network analysis, but only few of them have been evaluated with respect to some ground truth. We propose that the network analysis community should agree on benchmark data sets and ground-truth or gold-standard solutions, so that the quality of proposed algorithms can actually be tested.
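The following minimal sketch, assuming the networkx library is available, shows the kind of evaluation we argue for: a community detection method is run on a benchmark network with a recorded gold-standard partition (Zachary's karate club), and the agreement between the two partitions is quantified with the Rand index.

    from itertools import combinations
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    G = nx.karate_club_graph()

    # Gold standard: the factual split of the club, stored per node in "club".
    truth = {v: G.nodes[v]["club"] for v in G}

    # Method under test: greedy modularity maximization.
    detected = greedy_modularity_communities(G)
    predicted = {v: i for i, com in enumerate(detected) for v in com}

    # Rand index: fraction of node pairs on which both partitions agree
    # (same community in both, or different community in both).
    agree = sum(
        (truth[u] == truth[v]) == (predicted[u] == predicted[v])
        for u, v in combinations(G, 2)
    )
    pairs = len(G) * (len(G) - 1) // 2
    print(f"Rand index against the gold standard: {agree / pairs:.2f}")

The same pattern, a benchmark data set plus an agreed-upon agreement measure, would make results of different network analysis methods comparable.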

Complex network analysis

At the end of the 1990s, complex networks came into the focus of statistical physicists. A complex network represents the interactions of subjects and objects in a complex system. Typical examples of complex systems are social systems and biological cells of any type. A complex network normally represents interactions between only one type of entity of the system, e.g., the communication between humans in a given social group or the protein-protein interactions of a cell. Based on this network representation, we use methods from social and complex network analysis to find significant subpatterns, to look for central nodes, or to identify groups of similar subjects and objects.
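As a minimal sketch of such an analysis, assuming networkx is available, the following code ranks the nodes of a small, invented communication network by betweenness centrality, one classic way of finding central nodes; the graph is purely illustrative and not taken from any project.

    import networkx as nx

    # Hypothetical communication network between people in a small group.
    G = nx.Graph()
    G.add_edges_from([
        ("Ann", "Ben"), ("Ann", "Cem"), ("Ben", "Cem"),
        ("Cem", "Dia"),                      # Cem bridges the two subgroups
        ("Dia", "Eva"), ("Dia", "Finn"), ("Eva", "Finn"),
    ])

    # Betweenness centrality: how often a node lies on shortest paths
    # between other nodes, a common notion of "central".
    centrality = nx.betweenness_centrality(G)
    for node, value in sorted(centrality.items(), key=lambda x: -x[1]):
        print(f"{node:>5}: {value:.2f}")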
Examples of such projects by us are the identification of cell-growth-inhibiting microRNAs in a breast cancer type with especially high lethality [Uhlmann et al., 2012], the insight that social network platforms like Facebook can reveal quite a lot of information about non-members [Horvat et al., 2012], or the question of how the similarity of two products can be deduced from customers' buying behavior [Zweig, 2010].

Algorithmic graph theory

In some cases, the request of a cooperation partner leads to a new and interesting theoretical problem in graph theory. One example is our finding that finding all cliques in a max-tolerance graph is easy, based on the question of how to cluster the aligned sequences of a BLAST result [Kaufmann et al., 2006][Lehmann et al., 2006]. Another example is our work on the conditions under which classical association rules cannot be used [Zweig 2010] and on one-mode projections of bipartite graphs [Kaufmann and Zweig, 2011]. In addition, we have worked on cycle bases [Kavitha et al., 2009] and on a drawing model for streamed graphs [Binucci et al., 2010].
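To make the notion of a one-mode projection concrete, here is a minimal sketch, assuming networkx is available; the customers and products are hypothetical and only serve to illustrate the structure studied in [Kaufmann and Zweig, 2011].

    import networkx as nx
    from networkx.algorithms import bipartite

    # Bipartite graph: customers on one side, products on the other;
    # an edge means "customer bought product".
    B = nx.Graph()
    customers = ["c1", "c2", "c3"]
    products = ["book", "dvd", "game"]
    B.add_nodes_from(customers, bipartite=0)
    B.add_nodes_from(products, bipartite=1)
    B.add_edges_from([
        ("c1", "book"), ("c1", "dvd"),
        ("c2", "book"), ("c2", "dvd"), ("c2", "game"),
        ("c3", "game"),
    ])

    # One-mode projection onto the products: two products are linked by a
    # weighted edge counting the customers who bought both of them.
    P = bipartite.weighted_projected_graph(B, products)
    for u, v, data in P.edges(data=True):
        print(f"{u} -- {v}: co-bought by {data['weight']} customer(s)")

How meaningful such projected edge weights are, and when they can be interpreted like classical association rules, is exactly the kind of question addressed in the work cited above.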

Computer-assisted cognitive science

A new focus of the group is on the question of how we can assist psychological experiments by providing efficient algorithms to analyze thousands of results in a short time. Most classic psychological experiments are run under lab conditions in which participants take part in a standardized environment and manner. This way of experimenting is reliable but time-consuming and costly; it also only allows small numbers of participants (fewer than 100, in most cases). In many cases, the experiment itself is already run on a computer, and in principle it is easy to put it on a website and let many participants try it. However, many of the participants might start an experiment and then go get a coffee or be called by someone. To make the analysis reliable, it is necessary to automatically detect these deviant patterns. As the numbers are much higher than usual, an automated analysis of the results is also needed, which can be more or less difficult depending on the experiment.
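A minimal sketch of such an automatic filter is shown below; the response times, thresholds, and participant IDs are hypothetical and would have to be calibrated for a concrete experiment.

    # Each participant's record is a list of response times in seconds.
    sessions = {
        "p01": [2.1, 3.4, 2.8, 3.0, 2.5],
        "p02": [2.0, 2.9, 310.0, 2.7, 3.1],   # long pause: went for a coffee?
        "p03": [0.2, 0.3, 0.2, 0.2, 0.3],     # implausibly fast clicking
    }

    MAX_PAUSE = 120.0   # a gap this long counts as an interruption
    MIN_MEDIAN = 0.5    # medians below this suggest the task was not really done

    def median(values):
        s = sorted(values)
        mid = len(s) // 2
        return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

    def is_deviant(times):
        """Flag sessions with long interruptions or implausibly fast responses."""
        return max(times) > MAX_PAUSE or median(times) < MIN_MEDIAN

    kept = {p: t for p, t in sessions.items() if not is_deviant(t)}
    print("Sessions kept for analysis:", sorted(kept))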
We are especially interested in the problem-solving capabilities of humans in complex situations; we test these capabilities by letting participants play games like RushHour.
Our first project in that direction was the analysis of how humans find their way in a highly abstract network [Iyengar et al., 2012].