Wonda VR

Wonda VR makes it easy for content creators and brands to create and share highly engaging VR video experiences on any VR platform.

Building on its team’s 10-year track record of success in multimedia production and software development, Wonda VR offers the most intuitive and powerful software dedicated to publishing VR video apps on multiple mobile VR platforms (iOS, Android, Oculus Mobile, Google Daydream, and more to come).

With Wonda VR, anyone with a 360° camera and basic video editing skills can explore the power of VR for entertainment, training, or marketing. With a simple drag and drop, you can instantly add interactive layers to create your own VR experience from your 360° videos. There is no need for a costly game-engine developer: prototyping is fast, and publishing across VR platforms becomes affordable and accessible.

Wonda VR was released only 6 months ago and has attracted more than 1,500 registered professional users in 60 countries, a base growing 20% every month. Users include production studios from Melbourne to New York making immersive product presentations for top brands like Telstra, eBay, and Airbus, and top educational institutions such as NYU exploring the power of interactive storytelling in VR.

With the rise of affordable, easy to use 360° video cameras and well-designed VR headsets such as Google Daydream Viewer, more than 200,000 professional creators are expected to start making VR videos by the end of 2017.

Wonda VR’s goal is to become the #1 creation software for this new generation of content makers.





Realab brings together talented creators and technophiles to develop new apps and software with three words in mind: simple, user-friendly, and innovative.

Realab has developed Virtelio, software for creating VR movies with multiple storylines that adapts the story to the viewer’s unconscious behavior.

Boost your VR storytelling in a few simple clicks. No coding required.
With Virtelio you can take your VR storytelling to the next level while saving time and money.








Adok Beta is a multi-user projected surface. Up to six users can directly engage with content in a hands-on, proactive way.

Presentations become more vibrant, discussions more focused, and meetings become a way to actually get work done.





We offer the first fully user-friendly controller for drone and robot piloting.

Forget complex, non-intuitive joysticks and radio controllers: with our smart handband, you have all the power to control your drone with the natural gestures of a single hand. YOU are the remote!
It is accurate, intuitive, and reliable. If you move your hand left, the drone flies left; move it up, your drone rises: it’s that simple.
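As an illustration only, the hand-to-drone mapping described above can be sketched in a few lines of Python. The function name, axis names, and scaling are hypothetical assumptions, not PulsiT’s actual API:

```python
# Hypothetical sketch: map hand tilt angles (degrees) to drone velocity
# commands. Names and scaling are illustrative, not PulsiT's real interface.

def hand_to_velocity(roll_deg, pitch_deg, max_speed=2.0, max_tilt=45.0):
    """Map hand roll/pitch angles to lateral/forward drone speeds (m/s)."""
    def scale(angle):
        # Clamp to the working tilt range, then scale linearly to max_speed.
        angle = max(-max_tilt, min(max_tilt, angle))
        return max_speed * angle / max_tilt

    return {"left_right": scale(roll_deg), "forward_back": scale(pitch_deg)}
```

Tilting the hand fully left (`roll_deg=-45`) would command full leftward speed; a level hand commands zero, matching the "you are the remote" idea.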

Compatible with 80% of drone brands, including Parrot and DJI, as well as professional receivers, our product, "PulsiT", brings immersive control for a whole new piloting experience.

For drone enthusiasts using professional equipment, such as drone racers, our product connects directly to the drone thanks to our embedded transmitter.

For owners of Parrot and DJI drones, the PulsiT handband connects via Bluetooth to your smartphone, which then relays the signal to the drone. So beyond the hardware itself, we have also built an app to interface with it universally.

We have partnered with the leading operator of public drone flights in Paris, who is beginning to use our solution to help people learn to pilot and to heighten their fun and sensations.






Zumoko is an augmented reality company focused on solving the 3D model detection and tracking problem.

Our innovative CAD Model Detection and Tracking (CAD DT) software for Microsoft(R) HoloLens(TM) enables us to solve complex detection and tracking problems and engineer advanced AR solutions for the most challenging use cases. CAD DT is successfully used to develop advanced AR software for industrial instruction and maintenance use cases.

Because CAD DT is use-case independent, it can also be applied in other AR domains, such as retail, marketing, and education.






HoloLamp is the world’s first portable and self-contained device to deliver a glasses-free and hands-free augmented reality experience.

HoloLamp connects to a computer or smartphone and uses a pico projector and multiple sensors to render graphics that give the illusion of 3D objects directly on any surface.

Its accompanying software development kit in Unity enables developers around the world to design various use cases compatible with HoloLamp. The use cases include gaming, smart home, avatar communication, sports broadcasting, education and much more.





Have you ever dreamed of watching the Pyramids being built, exploring Pompeii before the eruption of Vesuvius, or attending Napoleon’s coronation at Notre-Dame de Paris?

We made it possible through Timescope.

TIMESCOPE is the first self-service VR terminal, designed for cultural and tourist sites. It feels like a time machine: the user picks a year and gains access to a fully immersive 360° experience showing the very place where he or she is standing, at another period in time.

The two co-founders will showcase the latest version of the product!



Realtime Robotics

We produce interactive robotic installations. We have developed software, based on a 3D real-time physics engine, that allows us to control a real environment through a virtual one.

This technology can animate one or several robots in real time. It virtualizes the hardware while also managing the graphics, connecting both more closely to their environment.

We design new products thanks to this approach:
• Collaborative ScreenBOT

The ScreenBOT brings a new way to explore VR or AR. The user is invited to handle a screen whose weight is compensated by the robot’s action. The screen is at zero gravity and acts like a weightless window opened onto a virtual space.
We have imagined several applications, such as live product customization, 3D exploration of works of art, video-game control, and browsing medical records, and we are working on many others…
• RobotBar

The first collaborative robot bar! The customer places an order through an app or a voice interface, and the robots work together, and/or with a barman, to fulfill it. The installation is flexible and can be adapted to the particular features of different locations, such as an existing bar…

Thanks to our software, the heart of the RobotBar is completely virtual; this is the source of both its flexibility and the robots’ capacity to interact with their environment.

New products are currently in the R&D phase, such as:
• Robot-performed photogrammetry: three robots equipped with cameras move around a person to produce a 3D model
• Motion parallax: the movements of a robot-operated camera modify a background displayed or filmed live
• Robot AirControl: control robots with gestures



VR Tracker

VR Tracker has developed a position tracking system for creating large-scale multiplayer VR experiences. Our technology allows users to walk through their 3D environment just as they walk around a room.

Even better: the tracked object can be used in many different ways, allowing our clients to create their own controllers for very specific applications.

Thanks to VR Tracker, you can create large-scale multiplayer experiences at a very affordable cost! Another great advantage: all the calculation and processing is done within the VR Tracker system, so there is no need for an expensive computer; you can add tracking even to Cardboard or Gear VR!

We currently provide the hardware, as well as plugins to use VR Tracker directly in Unity, UE4, or WebVR.
Later we will also create a WebVR platform where people will be able to build their 3D scenes directly in VR: they will be able to import their 3D models and drag and drop them in VR using our controllers.

Imagine an architect importing his building into our platform: he will be able to show it to his client, walk through the building, and collaborate on changes directly in VR.



Passer VR

We develop standard software solutions for interactive human character control.

Traditionally, human characters are controlled using manually created animations or motion capture. These animations can interact with their environment, but only in a limited way.

With the rise of virtual reality the interaction of human characters with their surroundings becomes a key requirement. People want to be able to work in virtual reality just like in real life: grab objects, operate levers, cooperate with other people in a natural way.

The software components we develop provide an instant solution with the following benefits:
- full body and facial tracking
- supports many body tracking devices including Oculus, HTC Vive, Gear VR, Cardboard, Leap Motion, Kinect and Perception Neuron
- can easily be extended with other (custom) devices
- works with any custom avatar
- traditional animation support
- full physics interaction with the environment
- built-in support for multi-user environments

For first-person avatars, the package can combine all available sensors. It fuses the tracking data to get the best possible result. The tracking data is checked against body joint limitations and physical limitations in the environment. This prevents hands moving through (virtual) objects like tables and it can translate the movements into forces applied to virtual objects.
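As an illustration only, the joint-limit check described above might look like the following Python sketch. The joint names, limits, and function names are hypothetical assumptions, not the package’s actual API:

```python
# Hypothetical sketch of clamping tracked joint angles to human joint limits,
# as described above. Joint names and ranges are illustrative assumptions.

JOINT_LIMITS = {                      # (min_deg, max_deg), rough ranges
    "elbow_flexion": (0.0, 150.0),
    "knee_flexion": (0.0, 140.0),
    "neck_yaw": (-80.0, 80.0),
}

def clamp_pose(tracked_angles):
    """Return a pose in which every tracked angle respects its joint limit."""
    clamped = {}
    for joint, angle in tracked_angles.items():
        lo, hi = JOINT_LIMITS.get(joint, (-180.0, 180.0))
        clamped[joint] = min(hi, max(lo, angle))
    return clamped
```

A fused tracking sample that reports an elbow bent past its anatomical range would be pulled back to the limit before the avatar is posed, the same idea that keeps virtual hands from passing through tables.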

Third-person characters or agents can be animated through traditional animations. These movements can again be checked against physical limitations so that body parts do not enter static objects like walls. It is even possible to interact directly with these avatars: taking objects from them, pushing them away, or making eye contact with them.

Our components offer an easy-to-use solution for interactive avatars. Interactive control can be achieved by adding just one script to the character. From then on, the character can do whatever we can do, like a real human.




ARTIFY is an innovation that puts the best of technology at the service of art.

Our solution makes it possible to discover and appreciate art virtually, but also to purchase the original physical works.

We target businesses (B-to-B) and their clients/employees. We provide them with a turnkey solution to:
• Bring their spaces to life by regularly changing the works on display
• Communicate their values to their clients or teams
• Build or showcase their own art collection
• Generate additional revenue

Our digital showcase consists of a range of connected, Ultra HD screen-frames giving access to a digitized art library of historical and contemporary works.
This virtual art catalog is built in partnership with museums, foundations, auction houses, galleries, and the artists themselves.

Through an intuitive application, our clients explore a collection of paintings, drawings, engravings, and photographs selected, digitized, and enriched with content by experts in their field.
All digitized works and their associated content (text, audio, video) are stored in the cloud.

Won over daily by the image quality of the works, our members will be able to purchase, in one click, the original behind their digital favorites.





Opuscope is developing Holostoria, the first software that lets creative people easily build virtual and augmented reality experiences without technical knowledge. Release is scheduled for February 2017.

Designed for companies’ marketing and communication departments, Holostoria allows them to highlight their content through a 3D scenography built with a VR headset, an MR device, or a PC.






HYPERSUIT is developing a dynamic motion experience using virtual reality.

Our dream is to make people fly through immersive technology and the embodiment of multiple characters (such as Iron Man, an astronaut, a bird, or a manta ray…).

Combining revolutionary interactive hardware with dedicated games, we enable people to feel, in complete safety, extreme sensations usually reserved for the most reckless among us.

Our simulator gives you the ability to interact fully with the environment through three key technological pillars:
1. Fully articulated mechanical extensions that allow players to interact with their arms in the virtual world.
2. Three motors at the bottom of the simulator, synchronized with the arms and the game, that move the simulator across three degrees of freedom (pitch / roll / up-down).
3. A fan in front of the player, synchronized with the whole structure, that blows air depending on the player’s speed in the game.

This fully synchronized experience makes players feel completely immersed in the universe they have chosen.
