The same model can be used for learning the dynamics of different sensors: range-finder, camera, field sampler. Our latest work on event-based vision was featured in MIT News. Welcome to Henri Rebecq as a new PhD student in our lab! Our latest work on failure recovery from aggressive flight shows how to launch a quadrotor by throwing it in the air! Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age.
Our mission is to develop autonomous machines that can navigate all by themselves using only onboard cameras, without relying on external infrastructure such as GPS or motion-capture systems. Our active volumetric reconstruction software framework is now released open source. For more info and applications, please see here. Read the paper here.
The goal of bootstrapping is to create agents that can learn “everything” from scratch, including a torque-to-pixels model of their robotic body.
With 29 authors, we set the record for a robotics conference paper. Our latest work on search-and-rescue robotics is a system for training a terrain classifier “on the spot” in only 60 seconds. From time to time, project supervisors will develop custom student research projects to fit a student’s particular interests or skills.
Read the full article. Check out the article here. September 27: Agile Flight through Narrow Gaps!
Welcome to Flavio Fontana as a new PhD student in our lab! Watch some impressions from our demonstrations!
Robot Perception Group
Check the news here (in Italian). Check out the speakers lineup on the workshop website.
The rhesis code is released under a GPLv3 licence. Melanie Zeilinger, Somil Bansal. We are happy to start The List of Event-based Vision Resources, which contains links to event-camera devices as well as papers, videos, code, etc. Welcome to Alessandro Simovic as a new drone engineer in our lab! An accurate closed-form estimate of ICP’s covariance.
I imagined that my robots would look like the golden humanoid C-3PO from Star Wars, with clumsy arms and a robotic voice. Our lab was featured in the World Robotics report of the International Federation of Robotics as an outstanding research lab in service robotics. Our latest work on probabilistic, monocular dense reconstruction in real time.
Watch the video preview here. For more details, see the IROS’14 paper and the accompanying video. Design problems can be interconnected to create “co-design problems”, which describe possibly recursive co-design constraints among subsystems.
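To make the recursive flavor of such constraints concrete, here is a minimal sketch, assuming illustrative subsystems and numbers (the function names `chassis`, `battery`, and `co_design` are hypothetical, not the API of any actual co-design tool): the battery must power the total mass, but the battery itself adds mass, so the feasible design is a fixed point.

```python
def chassis(payload_kg):
    """Resources (mass in kg, cost) needed to carry a given payload.
    The coefficients are made up for illustration."""
    mass = 2.0 + 0.5 * payload_kg
    cost = 100 + 20 * payload_kg
    return mass, cost

def battery(mass_to_power_kg):
    """Battery mass (kg) needed to power a platform of the given mass."""
    return 0.3 * mass_to_power_kg

def co_design(payload_kg, tol=1e-6):
    """Recursive co-design constraint: the battery powers the
    chassis-plus-battery mass. Iterate to the least fixed point."""
    batt = 0.0
    while True:
        chassis_mass, cost = chassis(payload_kg)
        new_batt = battery(chassis_mass + batt)
        if abs(new_batt - batt) < tol:
            return chassis_mass + new_batt, cost
        batt = new_batt
```

The iteration converges because each kilogram of platform demands less than a kilogram of battery; the loop is the simplest instance of solving an interconnected design problem by fixed-point iteration.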
You can experiment with the same code and materials that our students use in the Autonomous Mobility on Demand and Control Systems courses. ESIM readily provides ground truth depth and optic flow maps.
Theses & Semester Projects – Institute for Dynamic Systems and Control | ETH Zurich
Most of my papers come with software and datasets in the spirit of reproducible research, subject to time constraints. July 14: Binaries for SVO 2. Who is the Beauty and who is the Beast? The software corresponding to the paper SVO is available; we describe several improvements that ameliorate its performance. January 9: Our paper on efficient decentralized visual place recognition was accepted to RA-L! The lifetime of an event is the time that it takes for the moving brightness gradient causing the event to travel a distance of 1 pixel.
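The lifetime definition above translates directly into a formula: if the brightness gradient moves with velocity (vx, vy) in pixels per second, the time to travel one pixel is 1/||v||. A minimal sketch (not the paper's implementation; in practice the velocity would come from a local optical-flow estimate):

```python
import math

def event_lifetime(vx: float, vy: float) -> float:
    """Lifetime of an event: the time (in seconds) for the brightness
    gradient moving at velocity (vx, vy), in pixels/s, to travel 1 pixel."""
    speed = math.hypot(vx, vy)  # ||v|| in pixels/s
    if speed == 0.0:
        return float("inf")  # a static gradient generates no further events
    return 1.0 / speed

# A gradient moving at 200 px/s gives a lifetime of 5 ms.
```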
Coherent measurements selection via l1 relaxation. We performed a live quadrotor demo at the Zurich Kunsthalle during the Lange Nacht der Zürcher Museen, as part of the Ways of Thinking show, in front of a large crowd. Further details are available in our paper presented at IROS. Beyond its purely intellectual interest, this question is relevant to the medium-term challenges of robotics. We combined deep networks, local VIO, Kalman filtering, and optimal control.