Commit f426658: initial commit

JonasSchult committed Oct 5, 2018 (0 parents)
Showing 48 changed files with 4,609 additions and 0 deletions.
19 changes: 19 additions & 0 deletions .gitignore
@@ -0,0 +1,19 @@
.idea
*__pycache__*
*.obj
.history*
*.pyc
cache/
log/
log2/
*.npy
logs/
*.so
*.sh
3d-semantic-segmentation.wiki/

!experiments/

experiments/*
!experiments/iccvw_paper_2017/
*.out
21 changes: 21 additions & 0 deletions LICENSE
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2018 Jonas Schult

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
101 changes: 101 additions & 0 deletions README.md
@@ -0,0 +1,101 @@
# Exploring Spatial Context for 3D Semantic Segmentation of Point Clouds
Created by Francis Engelmann, Theodora Kontogianni, Alexander Hermans, Jonas Schult and Bastian Leibe
from RWTH Aachen University.

![prediction example](doc/exploring_header.png?raw=True "Prediction example")

### Introduction
This work is based on our paper
[Exploring Spatial Context for 3D Semantic Segmentation of Point Clouds](https://www.vision.rwth-aachen.de/media/papers/PID4967025.pdf),
which appeared at the IEEE International Conference on Computer Vision (ICCV) 2017, 3DRMS Workshop.

You can also check our [project page](https://www.vision.rwth-aachen.de/page/3dsemseg) for further details.

Deep learning approaches have made tremendous progress in the field of semantic segmentation over the past few years. However, most current approaches operate in the 2D image space. Direct semantic segmentation of unstructured 3D point clouds is still an open research problem. The recently proposed PointNet architecture presents an interesting step ahead in that it can operate on unstructured point clouds, achieving decent segmentation results. However, it subdivides the input points into a grid of blocks and processes each such block individually. In this paper, we investigate how such an architecture can be extended to incorporate larger-scale spatial context. We build upon PointNet and propose two extensions that enlarge the receptive field over the 3D scene. We evaluate the proposed strategies on challenging indoor and outdoor datasets and show improved results in both scenarios.

In this repository, we release code for training and testing various point cloud semantic segmentation networks on
arbitrary datasets.

### Citation
If you find our work useful in your research, please consider citing:

    @inproceedings{3dsemseg_ICCVW17,
      author    = {Francis Engelmann and
                   Theodora Kontogianni and
                   Alexander Hermans and
                   Bastian Leibe},
      title     = {Exploring Spatial Context for 3D Semantic Segmentation of Point Clouds},
      booktitle = {{IEEE} International Conference on Computer Vision, 3DRMS Workshop, {ICCV}},
      year      = {2017}
    }


### Installation

Install <a href="https://www.tensorflow.org/get_started/os_setup" target="_blank">TensorFlow</a>.
The code has been tested with Python 3.6 and TensorFlow 1.8.
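For example, one might install a matching release via pip (shown with the GPU package as an assumption; the exact command depends on your environment):

    pip install tensorflow-gpu==1.8.0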

### Usage
To get more representative blocks, we encourage uniformly downsampling the original point clouds.
This is done with the following script:

    python tools/downsample.py --data_dir path/to/dataset --cell_size 0.03

This produces point clouds in which each point is representative of its 3 cm × 3 cm × 3 cm neighborhood.
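Conceptually, this is a uniform voxel-grid downsampling. The following is a minimal numpy sketch of the idea, not the repository's `tools/downsample.py`; in particular, averaging all point attributes per voxel is an assumption:

```python
import numpy as np


def voxel_downsample(points, cell_size=0.03):
    """Replace all points falling into the same cell_size^3 voxel
    by their centroid (averaged over all columns, e.g. xyz + color)."""
    # Assign each point to a voxel by flooring its xyz coordinates.
    voxel_ids = np.floor(points[:, :3] / cell_size).astype(np.int64)
    # Group points by voxel and average all attributes per group.
    _, inverse, counts = np.unique(voxel_ids, axis=0,
                                   return_inverse=True, return_counts=True)
    sums = np.zeros((counts.shape[0], points.shape[1]), dtype=np.float64)
    np.add.at(sums, inverse, points)
    return sums / counts[:, None]
```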

To train or test a semantic segmentation model on point clouds, run:

    python run.py --config path/to/config/file.yaml

Detailed instructions on the structure of the yaml config files can be found in the wiki.
Additionally, example configuration files are given in the `experiments` folder.

Note that the final evaluation is done on the full-sized point clouds using k-nn interpolation.
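As an illustration of such an interpolation, here is a hedged sketch using `scipy` (a majority vote over the k nearest downsampled points; k=5 and the voting scheme are assumptions, and the repository's exact scheme may differ):

```python
import numpy as np
from scipy.spatial import cKDTree


def knn_interpolate_labels(sub_points, sub_labels, full_points, k=5):
    """Propagate predicted labels from a downsampled cloud back to the
    full-sized cloud via a majority vote over the k nearest neighbors."""
    tree = cKDTree(sub_points)
    _, idx = tree.query(full_points, k=k)    # (n_full, k) neighbor indices
    neighbor_labels = sub_labels[idx]        # (n_full, k) label votes
    # Majority vote per full-resolution point.
    return np.array([np.bincount(votes).argmax() for votes in neighbor_labels])
```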

### Reproducing the scores of our paper for Stanford Indoor 3D

#### Downloading the data set
First, download the Stanford Large-Scale 3D Indoor Spaces Dataset (S3DIS) by following the instructions [here](https://docs.google.com/forms/d/e/1FAIpQLScDimvNMCGhy_rmBA2gHfDu3naktRm6A8BPwAWWDv-Uhm6Shw/viewform?c=0&w=1).
Our results are based on the aligned version 1.2.

#### Producing numpy files from the original dataset
Our pipeline cannot handle the original S3DIS file format, so we first convert the dataset to npy files.
Note that `Area_5/hallway_6` has to be fixed manually due to format inconsistencies.

    python tools/prepare_s3dis.py --input_dir path/to/dataset --output_dir path/to/output

#### Downsampling for training
Before training, downsample the point clouds as described above:

    python tools/downsample.py --data_dir path/to/dataset --cell_size 0.03

#### Training configuration scripts
Configuration files for all experiments are located in `experiments/iccvw_paper_2017/*`. For example, an experiment can be
launched as follows:

    python run.py --config experiments/iccvw_paper_2017/s3dis_mscu/s3dis_mscu_area_1.yaml

The above command runs our multi-scale consolidation unit network on Stanford Indoor 3D with test area 1.

#### Evaluating on full-scale point clouds
Reported scores on the dataset are based on the full-scale point clouds.
To evaluate on them, we need to load the trained model and set the modus to `TEST`.

Replace `modus: TRAIN_VAL` with

```yaml
modus: TEST
model_path: 'path/to/trained/model/model_ckpts'
```
The trained model is located in the log directory specified during training.

### VKitti instructions

* Coming soon ...

### Trained models for downloading

* Coming soon ...

### License
Our code is released under the MIT License (see the LICENSE file for details).
45 changes: 45 additions & 0 deletions batch_generators/ReadWriteLock.py
@@ -0,0 +1,45 @@
"""
copyright by:
https://www.safaribooksonline.com/library/view/python-cookbook/0596001673/ch06s04.html
"""

import threading


class ReadWriteLock:
    """ A lock object that allows many simultaneous "read locks", but
    only one "write lock." """

    def __init__(self):
        self._read_ready = threading.Condition(threading.Lock())
        self._readers = 0

    def acquire_read(self):
        """ Acquire a read lock. Blocks only if a thread has
        acquired the write lock. """
        self._read_ready.acquire()
        try:
            self._readers += 1
        finally:
            self._read_ready.release()

    def release_read(self):
        """ Release a read lock. """
        self._read_ready.acquire()
        try:
            self._readers -= 1
            if not self._readers:
                self._read_ready.notify_all()
        finally:
            self._read_ready.release()

    def acquire_write(self):
        """ Acquire a write lock. Blocks until there are no
        acquired read or write locks. """
        self._read_ready.acquire()
        while self._readers > 0:
            self._read_ready.wait()

    def release_write(self):
        """ Release a write lock. """
        self._read_ready.release()
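For illustration, a typical usage pattern of this lock (not part of the original file; `shared_cache` is a hypothetical shared structure):

```python
lock = ReadWriteLock()
shared_cache = {}


def read_entry(key):
    # Many readers may hold the lock at the same time.
    lock.acquire_read()
    try:
        return shared_cache.get(key)
    finally:
        lock.release_read()


def write_entry(key, value):
    # Writers wait until all readers have released the lock.
    lock.acquire_write()
    try:
        shared_cache[key] = value
    finally:
        lock.release_write()
```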
4 changes: 4 additions & 0 deletions batch_generators/__init__.py
@@ -0,0 +1,4 @@
from batch_generators.batch_generator import *
from batch_generators.center_batch_generator import *
from batch_generators.multi_scale_batch_generator import *
from batch_generators.neighboring_grid_batch_generator import *
156 changes: 156 additions & 0 deletions batch_generators/batch_generator.py
@@ -0,0 +1,156 @@
from abc import ABC, abstractmethod
import itertools

import numpy as np
import tensorflow as tf

from tools.lazy_decorator import lazy_property


class BatchGenerator(ABC):
    """
    Abstract base class for batch generators providing the code for parallel creation of batches
    """

    def __init__(self, dataset, batch_size, num_points, augmentation):
        """
        :param dataset: dataset object
        :type dataset: Dataset
        :param batch_size: number of blobs per batch
        :type batch_size: int
        :param num_points: number of points in a batch
        :type num_points: int
        :param augmentation: whether to augment the data during training
        :type augmentation: bool
        """
        self.dataset = dataset
        self._num_points = num_points
        self._batch_size = batch_size
        self._augmentation = augmentation

    @lazy_property
    def handle_pl(self):
        # handle placeholder for switching between the training and test dataset
        return tf.placeholder(tf.string, shape=[], name='handle_training_test')

    @lazy_property
    def next_element(self):
        iterator = tf.data.Iterator.from_string_handle(self.handle_pl, self.dataset_train.output_types)
        return iterator.get_next()

    @lazy_property
    def dataset_train(self):
        # create dataset for training
        dataset_train = tf.data.Dataset.from_generator(self._next_train_index, tf.int64, tf.TensorShape([]))
        dataset_train = dataset_train.map(self._wrapped_generate_train_blob, num_parallel_calls=8)
        dataset_train = dataset_train.batch(self._batch_size)
        return dataset_train.prefetch(buffer_size=self._batch_size)

    @lazy_property
    def iterator_train(self):
        return self.dataset_train.make_one_shot_iterator()

    @lazy_property
    def iterator_test(self):
        # create dataset for testing
        dataset_test = tf.data.Dataset.from_generator(self._next_test_index, tf.int64, tf.TensorShape([]))
        dataset_test = dataset_test.map(self._wrapped_generate_test_blob, num_parallel_calls=8)
        dataset_test = dataset_test.batch(self._batch_size)
        dataset_test = dataset_test.prefetch(buffer_size=self._batch_size)
        return dataset_test.make_one_shot_iterator()

    @property
    def batch_size(self):
        return self._batch_size

    @property
    def num_points(self):
        return self._num_points

    @property
    def input_shape(self):
        return self.pointclouds_pl.get_shape().as_list()

    @lazy_property
    @abstractmethod
    def pointclouds_pl(self):
        raise NotImplementedError('Should be defined in subclass')

    @lazy_property
    @abstractmethod
    def labels_pl(self):
        raise NotImplementedError('Should be defined in subclass')

    @lazy_property
    @abstractmethod
    def mask_pl(self):
        raise NotImplementedError('Should be defined in subclass')

    @lazy_property
    @abstractmethod
    def cloud_ids_pl(self):
        raise NotImplementedError('Should be defined in subclass')

    @lazy_property
    @abstractmethod
    def point_ids_pl(self):
        raise NotImplementedError('Should be defined in subclass')

    def _next_train_index(self):
        """
        Get the next index into the training sample list (e.g. containing [pointcloud_id, center_x, center_y]).
        Take care that the list is shuffled for each epoch!
        :return: next index for training
        """
        for i in itertools.cycle(range(self.train_sample_idx.shape[0])):
            yield i

    def _next_test_index(self):
        """
        Get the next index into the test sample list (e.g. containing [pointcloud_id, center_x, center_y]).
        :return: next index for testing
        """
        for i in itertools.cycle(range(self.test_sample_idx.shape[0])):
            yield i

    def _wrapped_generate_train_blob(self, index):
        return tf.py_func(func=self._generate_blob,
                          # pos_id, train, aug_trans
                          inp=[index, True, self._augmentation],
                          # data, labels, mask, cloud_id, point_id
                          Tout=(tf.float32, tf.int8, tf.int8, tf.int32, tf.int64),
                          name='generate_train_blob')

    def _wrapped_generate_test_blob(self, index):
        return tf.py_func(func=self._generate_blob,
                          # pos_id, train, aug_trans
                          inp=(index, False, False),
                          # data, labels, mask, cloud_id, point_id
                          Tout=(tf.float32, tf.int8, tf.int8, tf.int32, tf.int64),
                          name='generate_test_blob')

    @property
    def num_train_batches(self):
        return self.train_sample_idx.shape[0] // self._batch_size

    @property
    def num_test_batches(self):
        # true division before the ceiling so that a final partial batch is counted
        return int(np.ceil(self.test_sample_idx.shape[0] / self._batch_size))

    @property
    @abstractmethod
    def test_sample_idx(self):
        """
        :rtype: ndarray
        """
        raise NotImplementedError('Should be defined in subclass')

    @property
    @abstractmethod
    def train_sample_idx(self):
        """
        :rtype: ndarray
        """
        raise NotImplementedError('Should be defined in subclass')

    @abstractmethod
    def _generate_blob(self, index, train=True, aug_trans=True):
        raise NotImplementedError('Should be defined in subclass')
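For context, `handle_pl` / `next_element` implement the standard TensorFlow 1.x feedable-iterator pattern. A hedged sketch of how such a setup is typically driven in a session (a generic illustration, not code from this repository; `batch_gen` stands for an instance of a concrete `BatchGenerator` subclass):

```python
import tensorflow as tf

with tf.Session() as sess:
    # Materialize string handles for the training and test iterators once.
    train_handle = sess.run(batch_gen.iterator_train.string_handle())
    test_handle = sess.run(batch_gen.iterator_test.string_handle())

    # The same next_element op yields training or test batches,
    # depending on which handle is fed into handle_pl.
    train_batch = sess.run(batch_gen.next_element,
                           feed_dict={batch_gen.handle_pl: train_handle})
    test_batch = sess.run(batch_gen.next_element,
                          feed_dict={batch_gen.handle_pl: test_handle})
```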