(If you are looking for code to predict friction from vision, please check this project instead)
This page contains a collection of datasets for benchmarking algorithms for visual prediction of friction, as well as for studying human perception of friction.
The datasets are described in the following publication:
You can download the data here: Friction from vision datasets. Please cite the publication above if you use the datasets.
This dataset was originally targeted at studying algorithmic and human performance at the task, for robot locomotion applications.
The dataset consists of several measurements on a set of 43 walkable surfaces:
This dataset was originally targeted at studying human perception of friction from vision. It is based on a subset of the OpenSurfaces dataset of Bell et al. [1] and the additional texture attributes of Cimpoi et al. [2]. On top of the image, label, and gloss data they provide, we obtained human (visual) judgements of friction.
The dataset consists of several measurements on a set of 96 walkable surfaces:
References:
This dataset was originally targeted at understanding whether text mining of large text sources (e.g. Wikipedia) can predict humans' intuitive ranking of materials by friction. Brandao et al. (2016) show some promising results.
The dataset consists of 19 different materials, each ordered from most to least slippery by 19 human subjects. The rankings were collected through an online survey without image support, from material names only.
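One natural way to use such per-subject rankings is to aggregate them into a single consensus ordering. Below is a minimal sketch that does this by mean rank position; the input format (one ranked list of material names per subject) and the material names are assumptions for illustration, not the dataset's actual file layout.

```python
# Hedged sketch: aggregate per-subject slipperiness rankings into a
# consensus ordering by mean rank. The data format (one ranked list of
# material names per subject) is assumed for illustration only.
from collections import defaultdict


def consensus_ranking(subject_rankings):
    """Order materials by their mean rank across subjects
    (position 0 = judged most slippery)."""
    totals = defaultdict(float)
    for ranking in subject_rankings:
        for position, material in enumerate(ranking):
            totals[material] += position
    n = len(subject_rankings)
    return sorted(totals, key=lambda m: totals[m] / n)


# Toy example with 3 subjects and 4 hypothetical materials
rankings = [
    ["ice", "steel", "wood", "rubber"],
    ["ice", "wood", "steel", "rubber"],
    ["steel", "ice", "wood", "rubber"],
]
print(consensus_ranking(rankings))  # ice first, rubber last
```

Mean-rank aggregation is only one choice; pairwise methods such as Kendall-tau-based rank aggregation would also apply to this kind of survey data.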