# piqa
**Description:**

Physical IQa (Physical Interaction QA) is a commonsense QA benchmark for naive
physics reasoning, focusing on how we interact with everyday objects in everyday
situations. The dataset focuses on affordances of objects, i.e., what actions
each physical object affords (e.g., it is possible to use a shoe as a doorstop),
and what physical interactions a group of objects affords (e.g., it is possible
to place an apple on top of a book, but not the other way around). The dataset
requires reasoning about both the prototypical use of objects (e.g., shoes are
used for walking) and non-prototypical but practically plausible uses of objects
(e.g., shoes can be used as a doorstop). The dataset includes 20,000 QA pairs
that are either multiple-choice or true/false questions.
**Splits:**

| Split          | Examples |
|----------------|----------|
| `'train'`      | 16,113   |
| `'validation'` | 1,838    |
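As a quick orientation, the sketch below loads both splits with `tfds.load`; it assumes the `tensorflow` and `tensorflow-datasets` packages are installed and that the builder is registered under the name `piqa`, as on this page. Since this dataset defines no supervised keys, each element is a feature dictionary.

```python
# Minimal sketch: load the PIQA splits registered in TFDS.
# Assumes `tensorflow` and `tensorflow-datasets` are installed.
import tensorflow_datasets as tfds

# Passing a list of split names returns one tf.data.Dataset per split.
train_ds, val_ds = tfds.load('piqa', split=['train', 'validation'])

# Each element is a dict with 'goal', 'id', 'label', 'sol1', 'sol2'.
print(train_ds.element_spec)
```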
**Feature structure:**

FeaturesDict({
'goal': Text(shape=(), dtype=string),
'id': Text(shape=(), dtype=string),
'label': ClassLabel(shape=(), dtype=int64, num_classes=2),
'sol1': Text(shape=(), dtype=string),
'sol2': Text(shape=(), dtype=string),
})
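Given that structure, a record can be decoded into Python strings and rendered as a two-way multiple-choice question. This is only an illustrative sketch; by PIQA's convention, `label` 0 selects `sol1` and 1 selects `sol2`.

```python
import tensorflow_datasets as tfds

ds = tfds.load('piqa', split='validation')

for example in ds.take(1):
    goal = example['goal'].numpy().decode('utf-8')
    sol1 = example['sol1'].numpy().decode('utf-8')
    sol2 = example['sol2'].numpy().decode('utf-8')
    # By PIQA's convention: 0 -> sol1 is the correct solution, 1 -> sol2.
    label = int(example['label'].numpy())

    print(f'Goal: {goal}')
    print(f'  (0) {sol1}')
    print(f'  (1) {sol2}')
    print(f'Correct choice: {label}')
```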
**Feature documentation:**

| Feature | Class        | Shape | Dtype  | Description |
|---------|--------------|-------|--------|-------------|
|         | FeaturesDict |       |        |             |
| goal    | Text         |       | string |             |
| id      | Text         |       | string |             |
| label   | ClassLabel   |       | int64  |             |
| sol1    | Text         |       | string |             |
| sol2    | Text         |       | string |             |
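For quick inspection, a few examples can also be pulled into a pandas DataFrame with `tfds.as_dataframe`; this sketch assumes pandas is installed alongside TFDS.

```python
import tensorflow_datasets as tfds

# Pull a handful of examples into a DataFrame for inspection
# (requires pandas; text columns contain raw bytes).
ds = tfds.load('piqa', split='train')
df = tfds.as_dataframe(ds.take(5))
print(df[['goal', 'sol1', 'sol2', 'label']])
```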
**Citation:**

@inproceedings{Bisk2020,
  author    = {Yonatan Bisk and Rowan Zellers and Ronan Le Bras and Jianfeng Gao and Yejin Choi},
  title     = {PIQA: Reasoning about Physical Commonsense in Natural Language},
  booktitle = {Thirty-Fourth AAAI Conference on Artificial Intelligence},
  year      = {2020},
}