Software & Data Downloads — CAZSL

Context-aware zero-shot learning (CAZSL) for learning a model that can generalize to different parameters or features of the interacting objects.

Learning accurate models of the physical world is required for many robotic manipulation tasks. During manipulation, however, robots are expected to interact with unknown workpieces, so building predictive models that generalize over a range of objects is highly desirable. We provide code for context-aware zero-shot learning (CAZSL) models, an approach that uses a Siamese network architecture, embedding-space masking, and regularization based on context variables to learn a model that can generalize to different parameters or features of the interacting objects. We evaluate the proposed learning algorithm on the recently released Omnipush data set, which allows testing of meta-learning capabilities using low-dimensional data. The code also allows comparison of the proposed method with several baseline techniques. The proposed method was presented at IROS 2020.
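The core idea, context-dependent masking of a shared embedding, can be sketched in a few lines. The snippet below is an illustrative numpy sketch, not the authors' implementation: all names, shapes, and the sigmoid-gated mask are assumptions chosen to show the mechanism (a shared encoder, as in a Siamese setup, whose embedding is element-wise gated by a mask computed from the context variables, plus a simple regularizer that keeps masks similar for similar contexts).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: input features, context variables, embedding size.
d_in, d_ctx, d_emb = 6, 3, 8

# Shared encoder weights (Siamese-style: the same weights embed every input).
W_enc = rng.normal(size=(d_in, d_emb))
# Context network producing a per-dimension mask over the embedding.
W_ctx = rng.normal(size=(d_ctx, d_emb))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, c):
    """Embed input x, then gate the embedding with a mask from context c."""
    h = np.tanh(x @ W_enc)      # shared embedding of the input
    m = sigmoid(c @ W_ctx)      # context-dependent mask in (0, 1)
    return h * m                # masked embedding, fed to a downstream regressor

def mask_regularizer(c1, c2):
    """Penalize mask differences between two contexts (L2; sketch only)."""
    m1, m2 = sigmoid(c1 @ W_ctx), sigmoid(c2 @ W_ctx)
    return float(np.sum((m1 - m2) ** 2))

x = rng.normal(size=d_in)
c = rng.normal(size=d_ctx)
z = forward(x, c)
print(z.shape)                  # the masked embedding has shape (d_emb,)
print(mask_regularizer(c, c))   # identical contexts incur zero penalty
```

In this sketch the mask lets a single set of encoder weights specialize its embedding per object: at test time a previously unseen context vector produces a new mask without retraining, which is the zero-shot generalization the paragraph above describes.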

  •  Zhang, W., Seto, S., Jha, D.K., "CAZSL: Zero-Shot Regression for Pushing Models by Generalizing Through Context", IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), November 2020.
    TR2020-140

    @inproceedings{Zhang2020nov,
      author    = {Zhang, Wenyu and Seto, Skyler and Jha, Devesh K.},
      title     = {CAZSL: Zero-Shot Regression for Pushing Models by Generalizing Through Context},
      booktitle = {IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
      year      = 2020,
      month     = nov,
      url       = {https://www.merl.com/publications/TR2020-140}
    }

Access software at https://github.com/merlresearch/CAZSL.