Before writing any code, it’s essential to carefully plan and document your task to ensure its relevance, feasibility, and compatibility with the OpenProblems framework.
Step 1: Check whether a similar task already exists
Please check the OpenProblems tasks to see whether a similar task has already been created.
Step 2: Write a task proposal
If no similar task exists, write a task proposal covering the sections below. This collaborative process helps ensure that your task is well defined, relevant, and compatible with the OpenProblems framework. It also informs others of your ongoing work in this area and lays the groundwork for potential collaboration. Check out some examples by filtering on the ‘task’ label on GitHub.
Task motivation
Explain the motivation behind your proposed task. Describe the biological or computational problem you aim to address and why it’s important. Discuss the current state of research in this area and any gaps or challenges that your task could help address. This section should convince readers of the significance and relevance of your task.
Task description
Provide a clear and concise description of your task, detailing the specific problem it aims to solve. Outline the input data types, the expected output, and any assumptions or constraints. Be sure to explain any terminology or concepts that are essential for understanding the task.
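As a purely hypothetical illustration of how explicit an input/output contract can be (the task, function name, and data shapes below are invented and not part of the OpenProblems API), a denoising-style task might be sketched as:

```python
import numpy as np

def denoise(counts: np.ndarray) -> np.ndarray:
    """Hypothetical task signature: given a noisy cells-by-genes count
    matrix, return a denoised matrix of the same shape.

    Constraint (example): the output must be non-negative and dense.
    """
    # A real method would go here; this placeholder just casts to float.
    return counts.astype(float)

noisy = np.random.default_rng(0).poisson(5.0, size=(100, 200))
denoised = denoise(noisy)
assert denoised.shape == noisy.shape
```

Writing the contract down this precisely (shapes, dtypes, constraints) makes it much easier for others to implement compatible methods.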
Proposed ground-truth datasets
Describe the datasets you plan to use for your task. OpenProblems offers a standard set of datasets (see “Common datasets”) which you can browse. Explain how these datasets will provide the ground truth for evaluating the methods implemented in your task. If possible, include references or links to the datasets to facilitate reproducibility.
Initial set of methods to implement
List the initial set of methods you plan to implement for your task. Briefly describe each method’s core ideas and algorithms, and explain why you think they are suitable for your task. Consider including both established and cutting-edge methods to provide a comprehensive benchmarking of the state-of-the-art.
Proposed control methods
Outline the control methods you propose for your task. These methods serve as a starting point to test the relative accuracy of new methods in the task and as quality control for the defined metrics. Include both positive controls, which are methods with known outcomes resulting in the best possible metric values, and negative controls, which are simple, naive, or random methods that do not rely on sophisticated techniques or domain knowledge. Explain the rationale for your chosen controls.
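To make the two kinds of controls concrete, here is a minimal, hypothetical sketch (the data shapes and function names are invented for illustration) for a task whose output is a matrix compared against a ground-truth solution:

```python
import numpy as np

rng = np.random.default_rng(0)

def positive_control(solution: np.ndarray) -> np.ndarray:
    # Best case: return the ground truth directly, so every metric
    # should report its optimal value.
    return solution.copy()

def negative_control(solution: np.ndarray) -> np.ndarray:
    # Naive baseline: randomly permute the ground-truth values,
    # destroying any real signal while keeping the value distribution.
    flat = solution.ravel().copy()
    rng.shuffle(flat)
    return flat.reshape(solution.shape)

truth = rng.normal(size=(50, 10))
assert np.allclose(positive_control(truth), truth)
assert negative_control(truth).shape == truth.shape
```

Any new method should then score between the negative and positive controls; a metric that fails to separate the two controls is itself suspect.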
Proposed metrics
Describe the metrics you propose for evaluating the performance of methods in your task. Explain the rationale for selecting these metrics and how they will accurately assess the methods’ success in addressing the task’s challenges. Consider including multiple metrics to capture different aspects of method performance.
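As one hedged example of such a metric (the function name and data format are assumptions for illustration, not part of the OpenProblems API), a reconstruction task might use root-mean-square error against the ground truth:

```python
import numpy as np

def rmse(pred, truth):
    """Root-mean-square error; lower is better, 0 is a perfect match."""
    pred, truth = np.asarray(pred, dtype=float), np.asarray(truth, dtype=float)
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

truth = np.array([1.0, 2.0, 3.0])
assert rmse(truth, truth) == 0.0        # positive control hits the optimum
assert rmse(truth + 1.0, truth) == 1.0  # constant offset of 1 gives RMSE 1
```

Pairing a scale-dependent metric like RMSE with a scale-free one (e.g. a correlation) is one way to capture complementary aspects of performance.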
Step 3: Create task info
Now create a task info metadata file in the src/tasks/<task_id>/api directory. This file is used to automatically generate the README in the task’s repository directory. The task info file should be named task_info.yaml and contain the following information:
```yaml
# A unique identifier. Can only contain lowercase letters, numbers or underscores.
name: ...
# A unique, human-readable, short label. Used for creating summary tables and visualisations.
label: ...
# A one sentence summary of purpose and methodology. Used for creating an overview table.
summary: ...
# The name of the image file to use for the component on the website.
image: ...
# A longer description (one or more paragraphs). Used for creating reference documentation and supplementary information.
motivation: ...
# A longer description (one or more paragraphs). Used for creating reference documentation and supplementary information.
description: ...
v1:
  # If this component was migrated from the OpenProblems v1 repository, this value
  # represents the location of the Python file relative to the root of the repository.
  path: ...
  # If this component was migrated from the OpenProblems v1 repository, this value
  # is the Git commit SHA of the v1 repository corresponding to when this component
  # was last updated.
  commit: ...
  # An optional note on any changes made during the migration.
  note: ...
authors:
  # Full name of the author, usually in the form FirstName MiddleName LastName.
  - name: ...
    # Additional information on the author
    info:
      github: ...
      orcid: ...
      email: ...
      twitter: ...
      linkedin: ...
    # Role of the author. Possible values:
    #
    # * `"author"`: Authors who have made substantial contributions to the component.
    # * `"maintainer"`: The maintainer of the component.
    # * `"contributor"`: Authors who have made smaller contributions (such as code patches etc.).
    roles: [ ... ]
```
Next steps
You are now well-equipped to begin designing the API for the new benchmarking task.