Ontology does not contain name

I ran the Python tutorial audio.ipynb and got an error.

The full error is as follows:

/tmp/ipython-input-2185389699.py:9: DeprecationWarning: The method wait_until_done for AnnotationImport is deprecated and will be removed in the next major release. Use the wait_till_done method instead.
  upload_job.wait_until_done()

Errors: [{'uuid': 'a67f19b4-c188-4300-b832-18f4223114d6', 'dataRow': {'id': None, 'globalKey': None}, 'status': 'FAILURE', 'errors': [{'name': 'ValidationError', 'message': 'Ontology does not contain name: text_audio.', 'additionalInfo': None}]}, {'uuid': '0268518f-f5d2-4dee-acd2-cd97c38a4d01', 'dataRow': {'id': None, 'globalKey': None}, 'status': 'FAILURE', 'errors': [{'name': 'ValidationError', 'message': 'Ontology does not contain name: checklist_audio.', 'additionalInfo': None}]}, {'uuid': '89db0530-a4d7-441b-98d3-c2c5b1576b5b', 'dataRow': {'id': None, 'globalKey': None}, 'status': 'FAILURE', 'errors': [{'name': 'ValidationError', 'message': 'Ontology does not contain name: radio_audio.', 'additionalInfo': None}]}]
Status of uploads:  [{'uuid': 'a67f19b4-c188-4300-b832-18f4223114d6', 'dataRow': {'id': None, 'globalKey': None}, 'status': 'FAILURE', 'errors': [{'name': 'ValidationError', 'message': 'Ontology does not contain name: text_audio.', 'additionalInfo': None}]}, {'uuid': '0268518f-f5d2-4dee-acd2-cd97c38a4d01', 'dataRow': {'id': None, 'globalKey': None}, 'status': 'FAILURE', 'errors': [{'name': 'ValidationError', 'message': 'Ontology does not contain name: checklist_audio.', 'additionalInfo': None}]}, {'uuid': '89db0530-a4d7-441b-98d3-c2c5b1576b5b', 'dataRow': {'id': None, 'globalKey': None}, 'status': 'FAILURE', 'errors': [{'name': 'ValidationError', 'message': 'Ontology does not contain name: radio_audio.', 'additionalInfo': None}]}]

Update:
Sorry for missing the full error message; the full code is here.

Could you provide the full code, not only the end? I am aware you have sent this: Ontology can not find the match name · Issue #2020 · Labelbox/labelbox-python · GitHub, but it would also help to have the project ID you are trying to send your predictions to.

Actually, here is the code I used; it worked.

import labelbox as lb
import labelbox.data.annotation_types as lb_types
import uuid

API_KEY = ""
client = lb.Client(api_key=API_KEY)

project = client.get_project("cmgu0drmr005807t54k3p8epp")

text_annotation_1 = lb_types.ClassificationAnnotation(
    name="text_audio",
    value=lb_types.Text(answer="some sample text")
)
checklist_annotation_1 = lb_types.ClassificationAnnotation(
    name="checklist_audio",
    value=lb_types.Checklist(answer=[
        lb_types.ClassificationAnswer(name="first_checklist_answer"),
        lb_types.ClassificationAnswer(name="second_checklist_answer")
    ])  
)
radio_annotation_1 = lb_types.ClassificationAnnotation(
    name="radio_audio",
    value=lb_types.Radio(answer=lb_types.ClassificationAnswer(
        name="first_radio_answer")
    )
)

annotations = [
    text_annotation_1,
    checklist_annotation_1,
    radio_annotation_1
]

labels = [
    lb_types.Label(
        data={
          "global_key": "<YOUR_GK>"
        },
        annotations=annotations,
    )
]

# Upload MAL label for this data row in project
# Upload MAL labels for this data row in the project.
# Note: MALPredictionImport is a class on the lb module, not an attribute of project.
upload_job = lb.MALPredictionImport.create_from_objects(
    client=client,
    project_id="cmgu0drmr005807t54k3p8epp",
    name="mal_job" + str(uuid.uuid4()),
    predictions=labels,
)

upload_job.wait_till_done()

print("Errors:", upload_job.errors)

In the project section there is an Import labels tab with sample code you can use; it already has all the schemas linked to your ontology.
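Since the ValidationError means the annotation names don't exist in the project's ontology, one way to debug is to compare the names you use in your annotations against the ontology's classification names. A minimal sketch of that check, using a made-up dict shaped like the SDK's normalized ontology (in a live session you would fetch it, e.g. via the project's attached ontology, rather than hard-code it):

```python
# Hypothetical normalized ontology for illustration only; a real one comes
# from the SDK, not from a literal like this.
normalized = {
    "tools": [],
    "classifications": [
        {"name": "text_audio", "type": "text"},
        {"name": "checklist_audio", "type": "checklist"},
        {"name": "radio_audio", "type": "radio"},
    ],
}

# The names used when building the ClassificationAnnotation objects.
annotation_names = ["text_audio", "checklist_audio", "radio_audio"]

# Collect the classification names the ontology actually contains.
ontology_names = {c["name"] for c in normalized["classifications"]}

# Any name in this list would trigger "Ontology does not contain name: ...".
missing = [n for n in annotation_names if n not in ontology_names]
print("Missing from ontology:", missing)  # → Missing from ontology: []
```

If `missing` is non-empty, the fix is to rename the annotations or add the matching classifications to the ontology before importing.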

Hi PT,

Thanks for reply.

I tried this script and it works. But is there any way to batch upload the data and annotations? The code in the Colab still doesn't work. Could you check that code?

This would be a sequential script, since we need to process the data, create a project, and make or link an ontology.
You can call wait_till_done() at any point and wait for a null response (which often indicates success).
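For the batch-upload part of that sequence, the usual pattern is: create data rows with unique global keys in one call, batch them into the project, then build one Label per global key. The sketch below only builds the payloads locally; the URLs and keys are placeholders, and the actual SDK calls (creating a dataset, creating data rows, and batching them into the project) are left as comments rather than executed:

```python
import uuid

# Placeholder audio files; in practice these would be your real asset URLs.
audio_urls = [
    "https://example.com/audio_1.mp3",
    "https://example.com/audio_2.mp3",
]

# One data-row payload per file, each with a unique global key.
# With the SDK you would pass this list to a dataset's data-row creation
# call and wait for the task to finish before batching.
data_rows = [
    {"row_data": url, "global_key": f"audio-{i}-{uuid.uuid4()}"}
    for i, url in enumerate(audio_urls)
]

# After batching the global keys into the project, build one Label per
# data row. With the SDK these would be lb_types.Label objects, e.g.
#   lb_types.Label(data={"global_key": dr["global_key"]}, annotations=annotations)
# and the list would go to the prediction import as `predictions=labels`.
labels = [{"global_key": dr["global_key"]} for dr in data_rows]

print(len(data_rows), len(labels))  # → 2 2
```

The key point is that the global key is the join between the data rows and the labels, so generating it once and reusing it in both payloads avoids mismatches.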

Hi, thanks for following up.

I finally fixed it. I set up the project and ontology, batch-loaded the dataset, and then it worked.
