Nebiker, Stephan


Search results

Now showing 1 - 7 of 7
  • Publication
    AI-based 3D detection of parked vehicles on a mobile mapping platform using edge computing
    (2022) Meyer, Jonas; Blaser, Stefan; Nebiker, Stephan [in: The international archives of the photogrammetry, remote sensing and spatial information sciences]
    In this paper we present an edge-based hardware and software framework for the 3D detection and mapping of parked vehicles on a mobile mapping platform for the use case of on-street parking statistics. First, we investigate different point cloud-based 3D object detection methods on our extremely dense and noisy depth maps obtained from low-cost RGB-D sensors to find a suitable object detector and to determine the optimal preparation of our data. We then retrain the chosen object detector to detect all types of vehicles rather than standard cars only. Finally, we design and develop a software framework integrating the newly trained object detector. By repeating the parking statistics of our previous work (Nebiker et al., 2021), we test the detection accuracy of our software. With our edge-based framework, we achieve a precision of 100% and a recall of 98% across all parking configurations and vehicle types, outperforming all other known work on on-street parking statistics. Furthermore, we evaluate our software in terms of processing speed and volume of generated data. While the processing speed reaches only 1.9 frames per second due to limited computing resources, the amount of data generated is just 0.25 KB per frame.
    04B - Conference proceedings contribution
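    The data-preparation step this abstract alludes to, turning a dense RGB-D depth map into a point cloud for a 3D object detector, can be sketched as follows. This is a minimal illustration with an assumed pinhole camera model and made-up intrinsics, not the authors' implementation:

    ```python
    import numpy as np

    def depth_to_point_cloud(depth, fx, fy, cx, cy):
        """Back-project a dense depth map (metres) into an (N, 3) point cloud
        using a simple pinhole camera model."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        # Stack into an (N, 3) array and drop invalid (zero-depth) pixels
        pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
        return pts[pts[:, 2] > 0]

    # Tiny synthetic example: a 2x2 depth map, every pixel at 1 m
    cloud = depth_to_point_cloud(np.ones((2, 2)), fx=500, fy=500, cx=0.5, cy=0.5)
    print(cloud.shape)  # (4, 3)
    ```

    In practice the resulting cloud would still need the denoising and subsampling the abstract hints at before being fed to a detector.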
  • Publication
    Image-based reality-capturing and 3D modelling for the creation of VR cycling simulations
    (Copernicus, 17.06.2021) Wahbeh, Wissam; Ammann, Manuela; Nebiker, Stephan; van Eggermond, Michael; Erath, Alexander [in: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences]
    With this paper, we present a novel approach for efficiently creating reality-based, high-fidelity urban 3D models for interactive VR cycling simulations. The foundation of these 3D models is accurately georeferenced street-level imagery, which can be captured using vehicle-based or portable mapping platforms. Depending on the desired type of urban model, the street-level imagery is either used for semi-automatically texturing an existing city model or for automatically creating textured 3D meshes from multi-view reconstructions using commercial off-the-shelf software. The resulting textured urban 3D model is then integrated with a real-time traffic simulation solution to create a VR framework based on the Unity game engine. Subsequently, the resulting urban scenes and different planning scenarios can be explored on a physical cycling simulator using a VR helmet or viewed as a 360-degree or conventional video. In addition, the VR environment can be used for augmented reality applications, e.g., mobile augmented reality maps. We apply this framework to a case study in the city of Berne, illustrating design variants of new cycling infrastructure at a major traffic junction and collecting feedback from practitioners about the potential for practical applications in planning processes.
    04B - Conference proceedings contribution
  • Publication
    Open urban and forest datasets from a high-performance mobile mapping backpack. A contribution for advancing the creation of digital city twins
    (International Society of Photogrammetry and Remote Sensing, 2021) Blaser, Stefan; Meyer, Jonas; Nebiker, Stephan [in: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences]
    With this contribution, we describe and publish two high-quality street-level datasets, captured with a portable high-performance Mobile Mapping System (MMS). The datasets will be freely available for scientific use. Both datasets, one from a city centre and one from a forest, represent area-wide street-level reality captures which can be used, e.g., for establishing cloud-based frameworks for infrastructure management as well as for smart city and forestry applications. The quality of these datasets has been thoroughly evaluated and demonstrated. For example, georeferencing accuracies in the centimetre range have been achieved using these datasets in combination with image-based georeferencing. Both high-quality multi-sensor street-level datasets are suitable for evaluating and improving methods for multiple tasks related to high-precision 3D reality capture and the creation of digital twins. Potential applications range from localization and georeferencing, dense image matching and 3D reconstruction to combined methods such as simultaneous localization and mapping and structure-from-motion as well as classification and scene interpretation. Our dataset is available online at: https://www.fhnw.ch/habg/bimage-datasets
    04B - Conference proceedings contribution
  • Publication
    Image-based orientation determination of mobile sensor platforms
    (International Society of Photogrammetry and Remote Sensing, 2021) Hasler, Oliver; Nebiker, Stephan [in: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences]
    Estimating the pose of a mobile robotic platform is a challenging task, especially when the pose needs to be estimated in a global or local reference frame and when the estimation has to be performed while the platform is moving. While the position of a platform can be measured directly via modern tachymetry or with the help of a global navigation satellite system (GNSS), the absolute platform orientation is harder to derive. Most often, only the relative orientation is estimated with the help of sensors mounted on the robotic platform, such as an IMU, one or multiple cameras, a laser scanner, or a combination of any of those. Then, a sensor fusion of the relative orientation and the absolute position is performed. In this work, an additional approach is presented: first, an image-based relative pose estimation with frames from a panoramic camera is performed using a state-of-the-art visual odometry implementation. Secondly, the position of the platform in a reference system is estimated using motorized tachymetry. Lastly, the absolute orientation is calculated using a visual marker placed in the space where the robotic platform moves. The marker can be detected in the camera frame, and since the position of this marker is known in the reference system, the absolute pose can be estimated. To improve the absolute pose estimation, a sensor fusion is conducted. Results with a Lego model train as a mobile platform show that the absolute pose trajectories calculated independently with four different markers deviate by less than 0.66 degrees 50% of the time and that the average difference is below 1.17 degrees. The implementation is based on the popular Robot Operating System (ROS).
    04B - Conference proceedings contribution
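    The core geometric idea of this abstract, recovering absolute orientation from a marker whose reference-frame position is known, can be illustrated in 2D. The sketch below is a simplified, hypothetical reduction to a single heading angle, not the paper's ROS-based pipeline:

    ```python
    import numpy as np

    def absolute_heading(platform_xy, marker_xy, bearing_body):
        """Estimate the platform's absolute heading (radians) from one marker.

        platform_xy: platform position in the reference frame (e.g. from tachymetry)
        marker_xy:   known marker position in the reference frame
        bearing_body: bearing to the marker as observed in the body frame (camera)
        """
        dx = marker_xy[0] - platform_xy[0]
        dy = marker_xy[1] - platform_xy[1]
        bearing_ref = np.arctan2(dy, dx)  # bearing to the marker in the reference frame
        # heading + bearing_body = bearing_ref  =>  heading = bearing_ref - bearing_body
        return (bearing_ref - bearing_body) % (2 * np.pi)

    # Marker due "north" of the platform, observed 90 degrees left of the body x-axis:
    h = absolute_heading((0.0, 0.0), (0.0, 5.0), np.pi / 2)
    print(round(h, 6))  # 0.0
    ```

    The full 3D case replaces the scalar subtraction with a rotation-matrix composition, and fusing several such single-marker estimates is what the sensor fusion in the abstract refers to.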
  • Publication
    Implementation and first evaluation of an indoor mapping application using smartphones and frameworks
    (2019) Hasler, Oliver; Blaser, Stefan; Nebiker, Stephan [in: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences]
    In this paper, we present the implementation of a smartphone-based indoor mobile mapping application based on an augmented reality (AR) framework and a subsequent performance evaluation in demanding indoor environments. The implementation runs on Android and iOS devices and demonstrates the great potential of smartphone-based 3D mobile mapping. The application includes several functionalities such as device tracking, coordinate and distance measurement, as well as capturing georeferenced imagery. We evaluate our prototype system by comparing measured points from the tracked device with ground control points in an indoor environment in two different campaigns. The first campaign consists of an open, one-way trajectory, whereas the second campaign incorporates a loop closure. In the second campaign, the underlying AR framework successfully recognized the start location and correctly repositioned the device. Our results show that the absolute 3D accuracy of device tracking with a standard smartphone is around 1% of the travelled distance and that the local 3D accuracy reaches sub-decimetre level.
    04B - Conference proceedings contribution
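    The reported accuracy of roughly 1% of travelled distance translates directly into an error budget for a planned survey. A back-of-the-envelope helper (purely illustrative, using the figure from the abstract):

    ```python
    def expected_drift(travelled_m, rel_accuracy=0.01):
        """Expected absolute tracking error (metres) for a given trajectory
        length, assuming drift of ~1% of the travelled distance."""
        return travelled_m * rel_accuracy

    # After a 50 m indoor trajectory, expect roughly half a metre of drift:
    print(expected_drift(50.0))  # 0.5
    ```

    This kind of estimate indicates when loop closures, such as the one in the second campaign, become necessary to keep the absolute error bounded.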
  • Publication
    Accurate visual localization in outdoor and indoor environments exploiting 3D image spaces as spatial reference
    (International Society for Photogrammetry and Remote Sensing, 2018) Rettenmund, Daniel; Fehr, Markus; Cavegn, Stefan; Nebiker, Stephan [in: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences]
    In this paper, we present a method for visual localization and pose estimation based on 3D image spaces. The method works in indoor and outdoor environments and does not require the presence of control points or markers. The method is evaluated with different sensors in an outdoor and an indoor test field. The results of our research show the viability of single image localization with absolute position accuracies at the decimetre level for outdoor environments and 5 cm or better for indoor environments. However, the evaluation also revealed a number of limitations of single image visual localization in real-world environments. Some of them could be addressed by an alternative AR-based localization approach, which we also present and compare in this paper. We then discuss the strengths and weaknesses of the two approaches and show possibilities for combining them to obtain accurate and robust visual localization in an absolute coordinate frame.
    04B - Conference proceedings contribution
  • Publication
    Drone-based environmental monitoring and mapping based on a virtual globe
    (2010) Eugster, Hannes; Flückiger, Kevin; Nebiker, Stephan
    04B - Conference proceedings contribution