Author Archives: griff

LADIO project will follow up on POPART

The Horizon 2020 project proposal LADIO: Live Action Data Input and Output has been accepted.

Like POPART before it, LADIO is an 18-month innovation action. This time, we focus on maximizing the collection of metadata on film sets in order to simplify collaboration between post-production facilities.
The project will have a strong computer vision aspect, especially since the Technical University of Prague is joining the POPART team, but Media will concentrate on storage and transmission. Stay tuned for more news from LADIO.

Papers accepted at MMSys and associated workshops

With the final notification deadlines behind us, we can report that MPG will return to MMSys with a good number of papers, dataset papers and demos.

MMSys

  • A High-Precision, Hybrid GPU, CPU and RAM Power Model for Generic Multimedia Workloads
    Kristoffer Robin Stokke, Håkon Kvale Stensland, Carsten Griwodz, Pål Halvorsen

NOSSDAV

  • Device Lending in PCI Express Networks
    Lars Bjørlykke Kristiansen, Jonas Markussen, Håkon Kvale Stensland, Michael Riegler, Hugo Kohmann, Friedrich Seifert, Roy Nordstrøm, Carsten Griwodz, Pål Halvorsen

MMSys Special Session on AR

  • Robustness of 3D Point Positions to Camera Baselines in Markerless AR Systems
    Deepak Dwarakanath, Carsten Griwodz, Pål Halvorsen

MMSys Dataset papers

  • Right inflight? A dataset for exploring the automatic prediction of movies suitable for a watching situation
    Michael Riegler, Martha Larson, Concetto Spampinato, Pål Halvorsen, Mathias Lux, Jonas Markussen, Konstantin Pogorelov, Carsten Griwodz, Håkon Kvale Stensland
  • Heimdallr: A dataset for sport analysis
    Michael Riegler, Duc-Tien Dang-Nguyen, Bård Winther, Carsten Griwodz, Konstantin Pogorelov, Pål Halvorsen

MMSys Demo papers

  • Computer Aided Disease Detection System for Gastrointestinal Examinations
    Michael Riegler, Konstantin Pogorelov, Jonas Markussen, Mathias Lux, Håkon Kvale Stensland, Thomas de Lange, Carsten Griwodz, Pål Halvorsen, Dag Johansen, Peter Thelin Schmidt, Sigrun L. Eskeland
  • Immersed gaming in Minecraft
    Milan Loviska, Otto Krause, Herman A. Engelbrecht, Jason B. Nel, Gregor Schiele, Alwyn Burger, Stephan Schmeißer, Christopher Cichiwskyj, Lilian Calvet, Carsten Griwodz, Pål Halvorsen
  • Ultra-Low Delay for All: Live Experience, Live Analysis
    Olga Bondarenko, Koen De Schepper, Ing-Jyh Tsang, Bob Briscoe, Andreas Petlund, Carsten Griwodz
  • Efficient Processing of Videos in a Multi Auditory Environment Using Device Lending of GPUs
    Konstantin Pogorelov, Michael Riegler, Jonas Markussen, Håkon Kvale Stensland, Pål Halvorsen, Carsten Griwodz, Sigrun Losada Eskeland, Thomas de Lange

C2Tag, a robust and accurate fiducial marker system for image-based localization from challenging images

Our paper on C2Tags has been accepted for publication at CVPR 2016.

C2Tags are a new approach to visual marker tracking that has been designed for localization in challenging environments, such as the film sets in our H2020 project POPART.

Like the markers in a few earlier papers, C2Tags consist of concentric rings whose positions in space are reconstructed before tracking occurs. The new contribution of C2Tags lies in the detection algorithm, which tolerates considerable partial occlusion and intense motion blur and still locates and subsequently identifies the marker.

Examples of C2Tag resilience

For POPART, where we use C2Tags to track film camera movement on sets where natural feature detectors fail or faster processing is required, this ability to handle fast motion and occlusion is a major game changer.

Paper Abstract

Fiducial markers offer reliable detection and identification of images of known planar figures in a view. They are used in a wide range of applications, especially when a reliable reference is needed to, e.g., estimate the camera movement in cluttered or textureless environments. A fiducial designed for such applications must be robust to partial occlusions, varying distances and angles of view, and fast camera movements.

In this paper, we present a new fiducial system whose markers consist of concentric circles: relying on their geometric properties, the proposed system allows accurate detection of the position of the image of the circles’ common center. Moreover, the different thicknesses of its rings can be used to encode the information associated with the marker, thus allowing the univocal identification of the marker.

We demonstrate that the proposed fiducial system can be detected in very challenging conditions, and the experimental results show that it outperforms other recent fiducial systems.
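
To make the geometric idea concrete, the following is a minimal Python/OpenCV sketch that looks for groups of nested ellipses sharing a common centre in a thresholded image. It is emphatically not the detection algorithm of the paper (a naive contour-based approach would not survive the occlusion and motion blur discussed above), and the file name, thresholds and grouping tolerance are hypothetical assumptions; it only illustrates why concentric rings make the common centre easy to estimate once the rings have been found.

# Naive illustration only: find clusters of ellipses with (nearly) coincident
# centres and report the mean centre of each cluster as a candidate marker.
import cv2
import numpy as np

img = cv2.imread("marker.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input file
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)

# Fit an ellipse to every sufficiently large contour.
ellipses = []
for c in contours:
    if len(c) < 20:                       # fitEllipse needs enough points; skip noise
        continue
    (cx, cy), (ax1, ax2), _angle = cv2.fitEllipse(c)
    ellipses.append(((cx, cy), max(ax1, ax2)))

# Group ellipses whose centres almost coincide; several nested rings around
# one point form a candidate marker, and the mean centre is a rough estimate
# of the common centre that C2Tags localize precisely.
candidates = []
used = [False] * len(ellipses)
for i, ((cx, cy), diameter) in enumerate(ellipses):
    if used[i]:
        continue
    group = [(cx, cy)]
    for j in range(i + 1, len(ellipses)):
        (ox, oy), _ = ellipses[j]
        if not used[j] and abs(cx - ox) < 0.05 * diameter and abs(cy - oy) < 0.05 * diameter:
            group.append((ox, oy))
            used[j] = True
    if len(group) >= 3:                   # require at least three nested rings
        candidates.append(tuple(np.mean(group, axis=0)))

print("candidate marker centres:", candidates)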

Trailer of Third Life premiere available

Artists Otto Krause and Milan Loviška took to the stage in three public performances of the Third Life Project, while their research collaborators from Stellenbosch University in South Africa, the University of Duisburg-Essen in Germany and Simula Research Lab in Norway followed closely from around the stage, ready to step in and assist on either the virtual or the real-world side of the performance.

Photo gallery: WUK: Third Life (8–10 October 2015, dress rehearsal), http://esel.cc/WUK_3rd-life | Photos: http://eSeL.at

High Performance Computing meets Performance Art

In their performative lecture, the artists, together with an international team of experts, exploit technology and employ artistic vision to blur the lines between human beings and machines and between reality and imagination. They explore current possibilities for developing an avatar performance for a live audience, a performance that operates within mixed realities (the real world and a “second life”) and at the same time aspires to open the door to the “third life”, where virtuality can transgress directly into reality.

Using a “smart stage”, they address the new performative possibilities of virtual environments that are not limited or constrained by the local space that the physical bodies inhabit. This unique interface between a simulated virtual world, the Internet of Things and novel tracking technologies allows virtual characters to perform activities in the real world, while activities of performers in the real world trigger changes in the virtual world. The notion of a third life is manifested here not only in the synchronous interconnection of the virtual and the real but also in their divergence, and it calls into question and re-examines what a body is, how a body operates and whether that body is alive or dead, real or virtual.

The Third Life Project, initiated in early 2014, combines artistic and scientific research and is devised in an ongoing, networked collaboration across national boundaries.

See more photos and a video summary from the 3 performances at http://thirdlifeserver.org/media.html

Premiere @ WUK Vienna, 08 October 2015.

Concept/Dramaturgy/Scenography/Performance: Otto Krause & Milan Loviška
Virtual environments of Minecraft: Otto Krause alias Aproktas
Minecraft expertise and gesture control: Herman Engelbrecht, Jason Bradley Nel (Stellenbosch University/MIH Medialab, South Africa)
Tracking: Carsten Griwodz, Lilian Calvet (Simula Research Lab & LABO Mixed Realities, Norway)
Cyberphysical devices and Non-Player-Characters: Gregor Schiele, Alwyn Burger, Stephan Schmeißer, Christopher Cichiwskyj (University of Duisburg-Essen, Germany)
Server: René Griessl (Bielefeld University, Germany)

A co-production of Territorium – Kunstverein and WUK Performing Arts in Vienna.

With the kind support of the City of Vienna’s Department of Cultural Affairs, Arts Division, and the Arts and Culture Division of the Federal Chancellery of Austria. With a contribution from the FiPS project, funded by the EU’s 7th Framework Programme for research, technological development and demonstration under grant agreement no 609757. Thanks to LABO Mixed Realities in Norway, the EU project POPART (Previz for On-set Production – Adaptive Realtime Tracking), funded under grant agreement no 644874, Bielefeld University in Germany and Stellenbosch University in South Africa.

Olvwm on Ubuntu

Olvwm (the OpenLook virtual window manager) is a positively antique window manager, originally written for SunOS, which I liked very much for its focus-strictly-under-mouse behaviour, thin window borders and pop-over window sliders. OpenLook was deprecated when the (in my opinion ugly and bloated) CDE desktop conquered the world, but modern Linux distros are capable of supporting the antique OpenLook applications again. So I gave it a try.

Installing is as easy as “apt-get install olvwm”. In /etc/lightdm/lightdm.conf, I’ve got the block

[SeatDefaults]
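# user-session must match the basename (without .desktop) of a session file
# in /usr/share/xsessions; the autologin-* keys log the named user straight
# into that session without showing the greeter.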
user-session=olvwm
autologin-user=myname
autologin-user-timeout=0
allow-guest=false
greeter-hide-users=true
greeter-show-manual-login=true

In /usr/share/xsessions, I’ve got the file olvwm.desktop, which contains

[Desktop Entry]
Name=Open Look
Exec=olvwm -f
Type=Application

That’s all it takes, and you have a pretty blank, autostarting window manager.

To start applications, you can add commands to your right-button context menu by editing $HOME/.openwin-menu. Since I’m totally happy with xterm, mine’s here:

"Root Menu" TITLE
"Shells" MENU
        "Shells" TITLE
        "bash"  exec x-terminal-emulator -title bash -e bash
        "ksh"   exec x-terminal-emulator -title ksh -e ksh
        "tcsh"  exec x-terminal-emulator -title tcsh -e tcsh
"Shells" END PIN
"Clock" MENU
        "Clock" TITLE
        "xclock"        exec xclock
"Clock" END PIN
"Browsers" MENU
        "Browsers" TITLE
        "Chrome"        exec google-chrome
        "Firefox"       exec firefox
"Browsers" END PIN
"Utilities" MENU
        "Clipboard..."  exec xclipboard
        "Refresh" DEFAULT                       REFRESH
        "Window Controls" MENU
                "Open/Close" DEFAULT    OPEN_CLOSE_SELN
                "Full/Restore Size"     FULL_RESTORE_SIZE_SELN
                "Back"                  BACK_SELN
                "Quit"                  QUIT_SELN
        "Window Controls" END PIN
        "Window Menu..."                        WINMENU
        "Restart olvwm"                         RESTART
"WM Utilities" END PIN
"Exit from X..."                                EXIT

With its tiny overhead, waiting for the desktop to appear is not an issue. According to one comparison, olvwm consumes only 1.2 MB of main memory, compared to roughly 150 MB for Gnome and 200 MB for KDE.

Third Life premieres at the WUK in Vienna

Artists Otto Krause and Milan Loviška took to the stage in three public performances of the Third Life Project, while their research collaborators from Stellenbosch University in South Africa, the University of Duisburg-Essen in Germany and Simula Research Lab in Norway followed closely from around the stage, ready to step in and assist on either the virtual or the real-world side of the performance.

Photo gallery: WUK: Third Life (8–10 October 2015, dress rehearsal), http://esel.cc/WUK_3rd-life | Photos: http://eSeL.at

High Performance Computing meets Performance Art

In their performative lecture, the artists, together with an international team of experts, exploit technology and employ artistic vision to blur the lines between human beings and machines and between reality and imagination. They explore current possibilities for developing an avatar performance for a live audience, a performance that operates within mixed realities (the real world and a “second life”) and at the same time aspires to open the door to the “third life”, where virtuality can transgress directly into reality.

Using a “smart stage”, they address the new performative possibilities of virtual environments that are not limited or constrained by the local space that the physical bodies inhabit. This unique interface between a simulated virtual world, the Internet of Things and novel tracking technologies allows virtual characters to perform activities in the real world, while activities of performers in the real world trigger changes in the virtual world. The notion of a third life is manifested here not only in the synchronous interconnection of the virtual and the real but also in their divergence, and it calls into question and re-examines what a body is, how a body operates and whether that body is alive or dead, real or virtual.

The Third Life Project, initiated in early 2014, combines artistic and scientific research and is devised in an ongoing, networked collaboration across national boundaries.

Premiere @ WUK Vienna, 08 October 2015.

Concept/Dramaturgy/Scenography/Performance: Otto Krause & Milan Loviška
Virtual environments of Minecraft: Otto Krause alias Aproktas
Minecraft expertise and gesture control: Herman Engelbrecht, Jason Bradley Nel (Stellenbosch University/MIH Medialab, South Africa)
Tracking: Carsten Griwodz, Lilian Calvet (Simula Research Lab & LABO Mixed Realities, Norway)
Cyberphysical devices and Non-Player-Characters: Gregor Schiele, Alwyn Burger, Stephan Schmeißer, Christopher Cichiwskyj (University of Duisburg-Essen, Germany)
Server: René Griessl (Bielefeld University, Germany)

A co-production of Territorium – Kunstverein and WUK Performing Arts in Vienna.

With the kind support of the City of Vienna’s Department of Cultural Affairs, Arts Division, and the Arts and Culture Division of the Federal Chancellery of Austria. With a contribution from the FiPS project, funded by the EU’s 7th Framework Programme for research, technological development and demonstration under grant agreement no 609757. Thanks to LABO Mixed Realities in Norway, the EU project POPART (Previz for On-set Production – Adaptive Realtime Tracking), funded under grant agreement no 644874, Bielefeld University in Germany and Stellenbosch University in South Africa.

OpenVQ – an objective quality assessment tool

OpenVQ is a video quality assessment toolkit. It is the result of a two-person master’s thesis that we initiated to counter the general trend of settling for PSNR in video quality estimation, by providing a tool that comes closer to the standardized objective metrics of the VQEG.

It can be accessed here: https://bitbucket.org/mpg_code/openvq

OpenVQ provides anyone interested in video quality assessment with a toolkit that (a) offers ready-to-use video quality metric implementations and (b) makes it easy to implement other video quality metrics.

Version 1 of OpenVQ contains the following metrics:

  • OPVQ – The Open Perceptual Video Quality metric
  • PSNR – Peak signal-to-noise-ratio (full reference)
  • SSIM – Structural similarity index (full reference)
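
To give a quick sense of what the two classical full-reference metrics in this list compute, here is a minimal Python sketch using scikit-image. This is not OpenVQ’s own implementation, and the function names and the simple averaging over frames are our own assumptions; it only illustrates the per-frame, reference-versus-distorted comparison that a full-reference metric performs.

# Minimal full-reference sketch (NOT OpenVQ code): per-frame PSNR and SSIM
# for pairs of 8-bit grayscale frames, averaged over a sequence.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def frame_scores(reference: np.ndarray, distorted: np.ndarray):
    """PSNR and SSIM for one pair of 8-bit grayscale frames."""
    psnr = peak_signal_noise_ratio(reference, distorted, data_range=255)
    ssim = structural_similarity(reference, distorted, data_range=255)
    return psnr, ssim

def sequence_scores(ref_frames, dist_frames):
    """Average the per-frame scores; real metrics may pool over time differently."""
    scores = [frame_scores(r, d) for r, d in zip(ref_frames, dist_frames)]
    psnrs, ssims = zip(*scores)
    return float(np.mean(psnrs)), float(np.mean(ssims))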

While everybody knows PSNR and SSIM, OPVQ needs some explanation: it is an interpretation of ITU-T J.247 without temporal alignment (we are aware of several patents covering temporal alignment). It provides good results for resolutions up to VGA. It was written without access to any implementation of PEVQ and without access to the datasets that were used in the VQEG competition that led to the standardization of J.247. It was also necessary to interpret the formal descriptions contained in J.247 loosely, because essential formulas were flawed.
OPVQ performs well when tested on the IRCCyN datasets, and we hope that independent parties will confirm its good performance.
