Releases: mbari-org/annosaurus
annosaurus 0.6.1
This is a bug fix release. The ZeroMQ publisher was posting duplicate messages, at least in some circumstances. We now filter those out using observable.distinct, where the filter compares the annotation's observationUuid and observationTimestamp to determine whether a message is a duplicate.
annosaurus 0.6.0
Added ZeroMQ notifications that publish when an annotation has been created or updated. This is targeted at MBARI's ship deployments, where ML apps can listen for user input from VARS. To deploy, modify your application.conf or pass the following environment parameters to your Docker container at startup:
- MESSAGING_ZEROMQ_PORT: Defines the port to publish messages to. Default is 5563
- MESSAGING_ZEROMQ_ENABLE: true/false. If you want ZMQ running set it to true. Default is false
- MESSAGING_ZEROMQ_TOPIC: The ZMQ topic to publish to. The default is "vars".
The messages will be JSON. For real-time operations, the minimum JSON will be formatted like so:
{
"observation_uuid": "49e0cd44-8915-4b6c-ad68-400ef3d41b06",
"observer": "brian",
"observation_timestamp": "2020-02-03T23:25:45.503055Z",
"concept": "Grimpoteuthis bathynectes",
"recorded_timestamp": "2020-02-03T23:25:45.503055Z",
"video_reference_uuid": "7cb320dc-0da5-4448-a26f-79177651735c",
"associations": [],
"image_references": []
}
Here are example ZMQ subscribers:
psenvsub.java
import org.zeromq.SocketType;
import org.zeromq.ZContext;
import org.zeromq.ZMQ;
import org.zeromq.ZMQ.Socket;

/**
 * Pubsub envelope subscriber
 */
public class psenvsub {

    static int port = 5563;       // MESSAGING_ZEROMQ_PORT
    static String topic = "vars"; // MESSAGING_ZEROMQ_TOPIC

    public static void main(String[] args) {
        // Prepare our context and subscriber
        try (ZContext context = new ZContext()) {
            Socket subscriber = context.createSocket(SocketType.SUB);
            subscriber.connect("tcp://localhost:" + port);
            subscriber.subscribe(topic.getBytes(ZMQ.CHARSET));
            while (!Thread.currentThread().isInterrupted()) {
                // Read envelope with address (the topic)
                String address = subscriber.recvStr();
                // Read message contents (the JSON annotation)
                String contents = subscriber.recvStr();
                System.out.println(address + " : " + contents);
            }
        }
    }
}
psenvsub.c
// Pubsub envelope subscriber
#include "zhelpers.h"

int main (void)
{
    // Prepare our context and subscriber
    void *context = zmq_ctx_new ();
    void *subscriber = zmq_socket (context, ZMQ_SUB);
    zmq_connect (subscriber, "tcp://localhost:5563");
    zmq_setsockopt (subscriber, ZMQ_SUBSCRIBE, "vars", 4);

    while (1) {
        // Read envelope with address (the topic)
        char *address = s_recv (subscriber);
        // Read message contents (the JSON annotation)
        char *contents = s_recv (subscriber);
        printf ("[%s] %s\n", address, contents);
        free (address);
        free (contents);
    }
    // We never get here, but clean up anyhow
    zmq_close (subscriber);
    zmq_ctx_destroy (context);
    return 0;
}
psenvsub.py
"""
Pubsub envelope subscriber
"""
import zmq
def main():
""" main method """
# Prepare our context and publisher
context = zmq.Context()
subscriber = context.socket(zmq.SUB)
subscriber.connect("tcp://localhost:5563")
subscriber.setsockopt(zmq.SUBSCRIBE, b"vars")
while True:
# Read envelope with address
[address, contents] = subscriber.recv_multipart()
print("[%s] %s" % (address, contents))
# We never get here but clean up anyhow
subscriber.close()
context.term()
if __name__ == "__main__":
main()
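Since the message contents are JSON, a listener will usually want to decode each payload into a dictionary. A minimal sketch, assuming the contents frame is UTF-8 encoded JSON shaped like the example above:
import json

import zmq

context = zmq.Context()
subscriber = context.socket(zmq.SUB)
subscriber.connect("tcp://localhost:5563")
subscriber.setsockopt(zmq.SUBSCRIBE, b"vars")

while True:
    topic, contents = subscriber.recv_multipart()
    # Decode the JSON payload (assumes UTF-8 encoding)
    annotation = json.loads(contents.decode("utf-8"))
    print(annotation["concept"], annotation["observation_uuid"])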
annosaurus 0.5.1
Improved bulk create/delete methods.
Create
Allows users to bulk insert annotations into annosaurus as JSON. We've tested this with up to about 5000 annotations at once. Note that annosaurus will handle large bulk inserts, but be aware that most clients will time out while waiting for the insert to complete. (The annotations do get inserted, though.)
Endpoint
/anno/v1/annotations/bulk
Example
POST /anno/v1/annotations/bulk
Content-Type: application/json
Authorization: Bearer <jwt access token>
[
  {
    "observation_uuid": "77d71526-0f5a-4f5b-8f00-964454aaf53e",
    "concept": "Pandalus platyceros",
    "observer": "brian",
    "observation_timestamp": "2017-08-22T20:38:48.219Z",
    "video_reference_uuid": "b6be1826-7413-48cc-8b74-324477eed114",
    "imaged_moment_uuid": "52cf5b45-427a-4d01-a175-e89387da419f",
    "elapsed_time_millis": 0,
    "associations": [],
    "image_references": [
      {
        "uuid": "e9e7a01f-f675-4f08-9434-1eaf0f462c3d",
        "description": "uncompressed image",
        "url": "http://somserver.org/images/i2MAP_2020170622T173833Z-b6be1826.png",
        "format": "image/png",
        "last_updated_time": "2017-09-11T14:37:09Z"
      },
      {
        "uuid": "ec4715f3-705f-4ece-b966-05c814ab6666",
        "description": "uncompressed image",
        "url": "http://somserver.org//images/i2MAP_2020170622T173833Z-b6be1826.png",
        "format": "image/png",
        "last_updated_time": "2017-09-12T15:21:52Z"
      }
    ]
  }
]
Returns
It returns the annotations, as JSON, after they are inserted. Note that you do not need to define any UUIDs, such as observation_uuid or uuid, other than the video_reference_uuid, which indicates which video the annotations are associated with. The returned annotations will have the UUID keys generated by the database.
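A minimal client sketch in Python using the requests library; the server URL and JWT token below are placeholders, and the field set follows the example above with the optional UUIDs omitted:
import requests

# Placeholder values -- substitute your own server and JWT access token
base_url = "http://localhost:8082/anno/v1"
jwt_token = "<jwt access token>"

annotations = [
    {
        "concept": "Pandalus platyceros",
        "observer": "brian",
        "observation_timestamp": "2017-08-22T20:38:48.219Z",
        "video_reference_uuid": "b6be1826-7413-48cc-8b74-324477eed114",
        "elapsed_time_millis": 0,
        "associations": [],
        "image_references": []
    }
]

response = requests.post(
    f"{base_url}/annotations/bulk",
    json=annotations,
    headers={"Authorization": f"Bearer {jwt_token}"},
)

# The returned annotations include the database-generated UUIDs
for anno in response.json():
    print(anno["observation_uuid"], anno["concept"])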
Delete
Deletes all annotations associated with a given video (defined by its video_reference_uuid).
Endpoint
/anno/v1/fast/videoreference/{video_reference_uuid}
Example
DELETE /anno/v1/fast/videoreference/b6be1826-7413-48cc-8b74-324477eed114
Authorization: Bearer <jwt access token>
Returns
Counts of all the objects deleted
{
"video_reference_uuid": "ef78c89e-079e-4e78-9645-56cfb2c23e3e",
"ancillary_data_count": 0,
"image_reference_count": 108,
"association_count": 10,
"observation_count": 500,
"imaged_moment_count": 499
}
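A corresponding delete sketch in Python; again, the server URL and JWT token are placeholders:
import requests

# Placeholder values -- substitute your own server and JWT access token
base_url = "http://localhost:8082/anno/v1"
jwt_token = "<jwt access token>"
video_reference_uuid = "b6be1826-7413-48cc-8b74-324477eed114"

response = requests.delete(
    f"{base_url}/fast/videoreference/{video_reference_uuid}",
    headers={"Authorization": f"Bearer {jwt_token}"},
)

# Counts of the deleted objects, as in the example above
print(response.json())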
annosaurus 0.5.0
We're working on adding methods to support exporting datasets to be referenced by publications. Added an endpoint, http://localhost:8082/anno/v1/fast/details/{link_name}/{link_value}, that allows finding annotations tagged with a specific detail/association. For example:
http://ione.mbari.org:8100/anno/v1/fast/details/comment/Nature20190609559
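A quick query sketch in Python, assuming the endpoint returns a JSON array of annotations like the other fast endpoints (the server URL is a placeholder):
import requests

# Placeholder server; the link name and value follow the example above
base_url = "http://localhost:8082/anno/v1"

annotations = requests.get(f"{base_url}/fast/details/comment/Nature20190609559").json()
print(len(annotations), "annotations found")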
annosaurus 0.4.0
Added a /windowrequest endpoint to find annotations in a set of videos near an existing annotation. The expected workflow for this endpoint is as follows:
1. Fetch videos of interest
In general, these are videos from the same camera deployment. You can retrieve them from vampire-squid using http://myserver.org/vam/v1/media/concurrent/{video_reference_uuid}. This will return an array of media objects that overlap with the one you provided (via its video_reference_uuid). For example:
[
  {
    "video_sequence_uuid": "28d1c03e-723e-4d78-9705-9b973a768c66",
    "video_reference_uuid": "e3005de7-2c0a-4a3f-80ad-79e5e79ee4b8",
    "video_uuid": "db0c3d6a-9849-4ff4-bc18-fd134337d114",
    "video_sequence_name": "Ventana 3009",
    "camera_id": "Ventana",
    "video_name": "V3009-01",
    "uri": "urn:tid:mbari.org:V3009-01",
    "start_timestamp": "2007-05-25T15:28:39Z",
    "duration_millis": 3600000,
    "container": "tape",
    "width": 1920,
    "height": 1080,
    "frame_rate": 29.97,
    "size_bytes": 0,
    "description": "Tape loaded from VARS on 2019-01-09T16:30:23.971Z. 86/86 valid dates"
  }
]
2. Use windowrequest
You will need:
- The video_reference_uuids of the videos you want to query from
- The imaged_moment_uuid of the existing annotation; its recordedTimestamp will be the center of the time window
- The window width in milliseconds
POST /anno/v1/imagedmoments/windowrequest
Content-Type: application/json
{
  "video_reference_uuids": [
    "e3005de7-2c0a-4a3f-80ad-79e5e79ee4b8",
    "9be0daeb-d303-46a9-b3fa-699d7d40591e"
  ],
  "imaged_moment_uuid": "0557b59b-60d5-44d0-acb3-71bf4acddf0a",
  "window": 10000
}
annosaurus 0.3.2
Added a fast endpoint for fetching images for a given video_reference_uuid. The endpoint is:
http://my.server.net/anno/v1/fast/images/videoreference/<video_reference_uuid>
It will return JSON image data structured like so:
[
  {
    "image_reference_uuid": "63dfcfa7-ea7c-4809-8435-89d3bae4bd64",
    "format": "image/jpg",
    "width": 1920,
    "height": 1080,
    "url": "http://search.mbari.org/ARCHIVE/frameGrabs/Ventana/images/3812/00_50_10_19.jpg",
    "description": "compressed image with overlay",
    "video_reference_uuid": "c101ecb4-d22d-4f5c-aa9f-1d1048643086",
    "imaged_moment_uuid": "57be066a-6498-4858-ad5f-07ed4254d8ea",
    "timecode": "00:50:10:19",
    "recorded_timestamp": "2014-11-04T17:05:31Z"
  },
  {
    "image_reference_uuid": "3faed261-d002-418d-9921-fcf163a0191d",
    "format": "image/png",
    "width": 1920,
    "height": 1080,
    "url": "http://search.mbari.org/ARCHIVE/frameGrabs/Ventana/images/3812/00_50_10_19.png",
    "description": "source image",
    "video_reference_uuid": "c101ecb4-d22d-4f5c-aa9f-1d1048643086",
    "imaged_moment_uuid": "57be066a-6498-4858-ad5f-07ed4254d8ea",
    "timecode": "00:50:10:19",
    "recorded_timestamp": "2014-11-04T17:05:31Z"
  }
]
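A fetch sketch in Python; the server URL and video_reference_uuid are placeholders taken from the example above:
import requests

# Placeholder server and video_reference_uuid
base_url = "http://my.server.net/anno/v1"
video_reference_uuid = "c101ecb4-d22d-4f5c-aa9f-1d1048643086"

images = requests.get(
    f"{base_url}/fast/images/videoreference/{video_reference_uuid}"
).json()
for image in images:
    print(image["recorded_timestamp"], image["format"], image["url"])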
annosaurus 0.2.14
Added an endpoint for fast lookup of imaged_moment_uuids by concept with images:
Example:
GET http://localhost:8082/anno/v1/fast/imagedmoments/concept/images/Nanomia%20bijuga
This call will return a list of imaged_moment_uuids for imaged moments that contain both an observation matching the given concept and an image. This is useful for machine learning tools looking for images matching a given term. The results will be JSON like:
[
"a6d073f5-2e8d-4d3b-a39f-0124aec5c1f7",
"f47645d5-f5a8-41f1-87d2-024f2560d7b9",
"ab7e53bc-dafe-4571-af65-0268be422b19",
"e49f67eb-ac45-462a-8aed-02865f68dbb8",
"137b6826-cd82-45ee-8f6e-0291769709f2",
"bcd8aaa0-64fa-4970-9d38-032b5bc9573d",
"636cb0c7-8fa9-4143-92ee-041ca70274e1",
"1fdee1e2-7a3a-48ec-824d-0441a9fcc8b2",
"3be58b2a-3fe6-4bc8-a078-04cf0d456ae8",
"aa6c6b47-175a-4655-bae9-0612dca91d94",
"2707eda5-7080-40e3-b85e-073541c087a1",
"e8a0b3a0-8932-489f-84d4-0794eb6bc4da",
"c1f853a3-297f-4652-9d01-07c5b2bdd616"
]
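A lookup sketch in Python; the server URL is a placeholder, and the concept name is URL-encoded because it may contain spaces:
from urllib.parse import quote

import requests

# Placeholder server; concept names with spaces must be URL-encoded
base_url = "http://localhost:8082/anno/v1"
concept = "Nanomia bijuga"

url = f"{base_url}/fast/imagedmoments/concept/images/{quote(concept)}"
imaged_moment_uuids = requests.get(url).json()
print(len(imaged_moment_uuids), "imaged moments with images")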
annosaurus 0.2.13
This is a bug fix addressing a minor issue introduced in 0.2.11. A call to /fast/concept/:concept?limit=xx would not always return the correct number of rows. This has been fixed.
annosaurus 0.2.12
Added JavaMelody for monitoring. It's linked to the path /monitoring. For example: http://localhost:8082/anno/monitoring
annosaurus 0.2.11
Added fast endpoints for fetching only annotations with images.
Example queries
Description | URL Example |
---|---|
Find by concept with images | http://localhost:8082/anno/v1/fast/concept/images/Nanomia |
Find by concept with images with limit and offset | http://localhost:8082/anno/v1/fast/concept/images/Nanomia?limit=20&offset=50 |
Count by concept (includes annotations without images) | http://localhost:8082/anno/v1/observations/concept/count/Nanomia |
Count by concept with images | http://localhost:8082/anno/v1/observations/concept/images/count/Nanomia |
Retrieving counts
Counting with or without images returns the same data structure. It contains the concept you requested and the count for that concept.
{"concept":"Nanomia", "count":"62"}