Merge pull request #159 from lsst-uk/develop
Develop
thespacedoctor authored Mar 14, 2023
2 parents 24d2417 + c365331 commit 0507635
Showing 81 changed files with 540 additions and 380 deletions.
Binary file modified docs/source/_images/lasair_icon_transparent.png
11 changes: 10 additions & 1 deletion docs/source/concepts/querying.md
@@ -21,7 +21,16 @@ ORDER BY glatmean
and I click 'Run this Query', then I get a list of objects that are coincident with
the given watchlist, together with the watchlist's name, and the galactic latitude.
The `WHERE` clause has restricted the results by galactic latitude, and the results
come in order of galactic latittude.
come in order of galactic latitude.

## Multiple Tables

Lasair supports queries that join multiple tables, for example a watchlist of
your favourite sources, or the [TNS](https://www.wis-tns.org/) list of known
supernovae and other transients. In this case, you are selecting **ONLY** those
objects that are **ALSO** in the chosen table. If you make a filter that selects
`objectId` and you also choose a watchlist, then your filter returns only alerts
coincident with the sources in the watchlist.
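As an aside, a similar joined selection can be run through the Lasair Python client rather than the web filter builder. This is a minimal sketch only: the watchlist identifier, the column names, and the table-joining syntax passed to `query()` are illustrative assumptions, so check the [Lasair client documentation](../core_functions/rest-api.html) for the exact form.

```
from lasair import lasair_client

# Hypothetical sketch: select objects that are also in watchlist number 42.
# The watchlist ID, column names and joining syntax below are illustrative.
token = 'your-api-token'          # from "My Profile" on the Lasair website
L = lasair_client(token)

results = L.query(
    'objects.objectId, objects.ramean, objects.decmean',   # selected columns
    'objects, watchlist:42',                                # tables to join (assumed syntax)
    'objects.glatmean > 20',                                # WHERE-style conditions
    limit=10)

for row in results:
    print(row)
```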

## Streaming Filter

18 changes: 0 additions & 18 deletions docs/source/concepts/sky-search.md

This file was deleted.

2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -113,7 +113,7 @@ def __getattr__(cls, name):
now = datetime.now()
now = now.strftime("%Y")
project = u'lasair'
copyright = u'%(now)s, The University of Edinburgh and Queen\'s University Belfast' % locals()
copyright = u'%(now)s, The University of Edinburgh and Queen\'s University Belfast. Funded by UK STFC grants ST/X001334/1 and ST/X001253/1' % locals()
version = "v" + str(__version__)
release = version
today_fmt = '%Y'
11 changes: 10 additions & 1 deletion docs/source/core_functions/alert-streams.md
@@ -33,13 +33,22 @@ You will need to understand two concepts: Topic and GroupID.
* The Topic is a string to identify which stream of alerts you want, which derives from the name of a Lasair streaming query.
* The GroupID tells Kafka where to start delivery to you. It is just a string that you can make up, for example "Susan3456". The Kafka server remembers which GroupIDs it has seen before, and which was the last alert it delivered to each one. When you start your code again with the same GroupID, you only get alerts that arrived since the last time you used that GroupID. If you use a new GroupID, you get the alerts from the start of the Kafka cache, which holds about 7 days.

For testing purposes, the `group_id` will change frequently, and you get all of the alerts
You can find the topic that corresponds to your filter on its detail page, shown here in the red oval:

<img src="../_images/alert-streams/seriousfilter.png" width="500px"/>

The topic name is a combination of the string "lasair_", the ID number of your user account, and
a sanitised version of the name you gave the filter; for example, a filter called "SN candidates"
owned by user 42 might stream on a topic such as `lasair_42SNcandidates` (an illustrative name, not a real topic).
Therefore, if you edit the filter and change its name, the topic name will also change.

For testing purposes, the `group_id` can change frequently, so that you get all of the alerts
that come from the given stream each time. Then you will set up your program to run continuously,
perhaps in a `screen` session on a server machine, or started every hour by `cron`.
In this case, the `group_id` should remain constant, so you won't get any alerts twice.
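For instance, here is a minimal sketch of choosing the `group_id` in the two modes; the helper names below are illustrative and not part of the Lasair client:

```
import uuid

TESTING = True

if TESTING:
    # A fresh group_id on every run: Kafka treats this as a new consumer group,
    # so you receive everything still held in the ~7 day cache.
    group_id = 'test_' + uuid.uuid4().hex[:8]
else:
    # A fixed group_id for a long-running service: Kafka remembers what it has
    # already delivered to this group, so no alert is delivered twice.
    group_id = 'Susan3456'
```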

Here is the sample code
```
import json
from lasair import lasair_consumer
kafka_server = 'kafka.lsst.ac.uk:9092'
```
26 changes: 19 additions & 7 deletions docs/source/core_functions/make_annotator.md
@@ -5,7 +5,14 @@ Annotation means that external users push information to the Lasair database.
Therefore it requires the user to inform the Lasair team and be approved
before it will work. The team will use the admin interface to create an `annotator`
object in the database, which is a connection between the API token of that user
with the name (`topic`) assigned to the annotator. Here we see that admin interface:
with the name (`topic`) assigned to the annotator.

To get started, write to the
<a href="mailto:[email protected]?subject=New Annotator">Lasair Team</a>
to propose your annotator.

Here is the admin interface that the Lasair team uses to create your annotator,
if it is approved:

<img src="../_images/make_annotator/fastfinder.png" width="500px"/>

@@ -17,25 +24,29 @@ goes wild, the annotator can be switched off by setting `active=0`. Finally, the
can be `public` or not, meaning that it is visible or not to others building Lasair filters.

### Make the code
The following code reads the stream from a Lasair filter, and for each objectId,
The following code reads the stream from a Lasair filter, and for each `objectId`,
it pulls the complete object information so it can analyse the lightcurve
and other information before making a classification decision.
This information is collected as the annotation and sent back to Lasair,
where it will be available for others to query.

The settings file that is needed with the code looks like:
* `TOPIC_IN`: The name of your streaming query as seen in the 'more info' button of your filter
There should be a `settings.py` file to accompany the code below, with these variables defined (see the sketch after this list):

* `TOPIC_IN`: The name of your streaming query, as seen on the filter detail page where it says "The filter is streamed via kafka with the topic name"
* `GROUP_ID`: Choose a new one every run when testing; keep it constant for long-term running
* `API_TOKEN`: As found in 'My Profile', top right of the web page
* `TOPIC_OUT`: The name of your annotator as agreed with the Lasair team (above)
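For example, a minimal `settings.py` might look like the following; every value is a placeholder, not a real topic, group or token:

```
# settings.py -- placeholder values only
TOPIC_IN  = 'lasair_42myfilter'     # topic name shown on your filter's detail page
GROUP_ID  = 'my_annotator_test_1'   # change for every run while testing, keep fixed in production
API_TOKEN = '0123456789abcdef'      # from "My Profile" on the Lasair website
TOPIC_OUT = 'my_annotator'          # annotator name agreed with the Lasair team
```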

If the code below is not clear, it would be good for you to read about how
the [Lasair client](rest-api.html) works.

For more information about what is returned as `objectInfo`, a complete example
is [shown here](ZTF23aabplmy.html).

For testing purposes, the `GROUP_ID` can change frequently, so that you get all of the alerts
that come from the given stream each time. Then you will set up your annotator program to run continuously,
perhaps in a `screen` session on a server machine, or started every hour by `cron`.
In this case, the `GROUP_ID` will remain constant, so you won't het any alerts twice.
In that case, the `GROUP_ID` will remain constant, so you won't get any alerts twice.

Much simpler code is possible if, for example, the annotation is the classification
result from another broker. In that case, only the call to `L.annotator()` is necessary, as sketched below.
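Here is a minimal sketch of that simpler case. The keyword arguments passed to `L.annotator()` below are assumptions for illustration only; consult the Lasair client documentation for the exact signature.

```
import settings
from lasair import lasair_client

L = lasair_client(settings.API_TOKEN)

# Hypothetical: push one classification for one object.  The argument
# names here are illustrative, not the definitive client signature.
L.annotator(
    topic=settings.TOPIC_OUT,
    objectId='ZTF23aabplmy',
    classification='SN Ia',
    explanation='classification copied from another broker',
    classdict={'prob_SNIa': 0.92},
    url='')
```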
@@ -95,7 +106,7 @@ topic_out = settings.TOPIC_OUT
# just get a few to start
max_alert = 5
n_annotate = 0
n_alert = n_annotate = 0
while n_alert < max_alert:
    msg = consumer.poll(timeout=20)
    if msg is None:
@@ -105,7 +116,8 @@ while n_alert < max_alert:
        break
    jsonmsg = json.loads(msg.value())
    objectId = jsonmsg['objectId']
    n_alert += 1
    n_annotate += handle_object(objectId, L, topic_out)
print('Annotated %d objects' % n_annotate)
print('Annotated %d of %d objects' % (n_annotate, n_alert))
```
9 changes: 5 additions & 4 deletions docs/source/core_functions/python-notebooks.md
@@ -6,11 +6,12 @@



Below are some Google colab notebooks that use Lasair API, which has a token authentication scheme. The notebooks below have a "free sample" token that allows only 10 calls to the API in an hour. You can get a much better token by
Below are some Google Colab notebooks that use the Lasair API, which has a token authentication scheme.
These notebooks have a fake token "xxxxxxx" that doesn't work. To use the notebooks, you will need to:

* get a Lasair account [here]({%lasairurl%}/signup)
* follow the instructions in the **[API documentation](core_functions/rest-api.html)
* then copy the notebook to your own Google account.
* Get a Lasair account [here]({%lasairurl%}/signup), and log in to the Lasair website.
* Click on your username at the top right and select "My Profile"
* Copy the notebook to your own Google account, and replace the xxxxxxx with your own token, as in the sketch below.
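For instance, the first cell of a notebook typically sets up the client like this minimal sketch; the token value is a placeholder, and the `objects()` call and example object name are assumptions for illustration:

```
from lasair import lasair_client

token = 'xxxxxxx'                 # replace with the API token from "My Profile"
L = lasair_client(token)

# A small call to check the token works: fetch one object by its identifier.
result = L.objects(['ZTF23aabplmy'])
print(result)
```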

---

39 changes: 18 additions & 21 deletions docs/source/core_functions/sky-search.md
@@ -1,29 +1,26 @@
# Sky Search

## The Search Bar
At the top of every Lasair page is a form that can be filled in to do various kinds of
search. The following are supported:

<img src="../_images/search/searchbar.png" height="400px"/>
- Search by RA and Dec decimal degrees, delimited by space or comma:
```
308.590715 9.278195
308.590715, 9.278195
```

At the top of every Lasair page is a form that can be filled in to do a search
on the sky. You can enter several different kinds of search term, as illustrated
in the screenshot above:
- Search by RA and Dec sexagesimal coordinates, delimited by space or comma:
```
20:34:21.7 09:16:41.5
20:34:21.7, 09:16:41.5
```

- A Lasair object name
- Adding a search radius in arcseconds to any of the previous, for example:
```
308.590715 9.278195 60
```

- The name of an entry in the
<a href=https://www.wis-tns.org/>Transient Name Server (TNS)</a>, for example
<a href=https://www.wis-tns.org/search?name=SN2022ffg>SN2022ffg</a>
- Search by object identifier, for example ```ZTF23aacvrxx```

- The name of a host galaxy, that can be searched in the
<a href=https://ned.ipac.caltech.edu/>NED database</a>, for example
<a href=https://ned.ipac.caltech.edu/cgi-bin/objsearch?objname=CGCG093-074>CGCG093-074</a>.
- Search by TNS identifier, for example ```AT2020iry``` or ```SN2020iry``` or ```2020iry```.

- Right ascension and declination in J2000 frame:

- As decimal degrees, for example 141.15725 25.39508,

- As decimal degrees with a search radius in arcseconds, for example 141.15725 25.39508 10,

- As sexagesimal, for example 10:13:48.2 18:07:38.3

- As sexagesimal with a search radius in arcseconds, for example 10:13:48.2 18:07:38.3 10
15 changes: 11 additions & 4 deletions docs/source/index.md
@@ -1,10 +1,18 @@
<img src="_images/lasair_logo_transparent.png" width=500>

# Lasair: A UK Alert Stream Broker
# [Lasair: A UK Alert Stream Broker]({%lasairurl%})

*Lasair* is being developed by The University of Edinburgh and Queen\'s University Belfast with the ultimate goal of serving transient alerts from the future [Rubin Legacy Survey of Space and Time (LSST)](https://www.lsst.org) to the astronomical community. *Lasair* (pronounced '*L-AH-s-uh-r*') means flame or flash in Scots and Irish gaelic. To prototype functionality needed to process transient event alerts from LSST, Lasair is currently processing and serving data from the public stream of the [Zwicky Transient Facility (ZTF)](http://www.ztf.caltech.edu/), which is releasing a transient alert stream in a format similar to that envisaged for LSST. We thank ZTF for access to this valuable public data stream.
[*Lasair*]({%lasairurl%}) is being developed by The University of Edinburgh, Queen\'s University Belfast,
and Oxford University with the ultimate goal of serving transient alerts from the
future [Rubin Legacy Survey of Space and Time (LSST)](https://www.lsst.org) to the
astronomical community. *Lasair* (pronounced '*L-AH-s-uh-r*') means flame or
flash in Scots and Irish Gaelic. To prototype functionality needed to process transient
event alerts from LSST, Lasair is currently processing and serving data from the
public stream of the [Zwicky Transient Facility (ZTF)](http://www.ztf.caltech.edu/),
which is releasing a transient alert stream in a format similar to that
envisaged for LSST. We thank ZTF for access to this valuable public data stream.

If you make use of Lasair in any of your work, please remember to cite our paper: [Lasair: The Transient Alert Broker for LSST:UK](https://doi.org/10.3847/2515-5172/ab020f), K. W. Smith, R. D. Williams et. al., Research Notes AAS, **3**,26 (2019).
If you make use of [Lasair]({%lasairurl%}) in any of your work, please remember to cite our paper: [Lasair: The Transient Alert Broker for LSST:UK](https://doi.org/10.3847/2515-5172/ab020f), K. W. Smith, R. D. Williams et. al., Research Notes AAS, **3**,26 (2019).

To start using Lasair to filter the deluge of alerts:
```eval_rst
@@ -33,7 +41,6 @@ Please start here to discover the key ideas of Lasair:
concepts/objects_sources
concepts/lightcurve
concepts/sky-context
concepts/sky-search
concepts/querying
concepts/coding
concepts/added-value
20 changes: 13 additions & 7 deletions docs/source/more_info/faqs.md
@@ -1,9 +1,5 @@
# Questions and Answers

**Table of Contents**

{{TOC}}

## What are Lasair-ZTF and Lasair-LSST?

See [ZTF and LSST](../about.html#ztf-and-lsst)
@@ -50,7 +46,7 @@ Choosing interesting alerts can be based on several criteria: The characteristic

## Why should I register on the Lasair website?

Registration is easy, and just requires a valid email (signup [here]({%lasairurl%}/signup)). You can then build and save queries, watchlists, and sky areas, convert those to real-time slert treams, and use the Lasair API.
Registration is easy, and just requires a valid email (signup [here]({%lasairurl%}/signup)). You can then build and save queries, watchlists, and watchmaps (sky areas), convert those to real-time alert streams, and use the Lasair API.

## Besides Lasair, what other websites carry astronomical transients?

@@ -61,9 +57,15 @@ There are seven community brokers that will receive and process LSST alerts in r
Lasair has been processing, storing, and distributing alerts from the ZTF survey since 2018.
Operation with LSST will start in 2023.

## Why are there no alerts on the Lasair front page?

The front page shows alerts from the last seven days. Sometimes no alerts have been received
in that time, and so none are shown. Possible reasons include bad weather or equipment failure.
More information is available in the green news bar at the top of the front page.

## Can I get alerts from a particular region of the sky?

Lasair supports "sky areas", defined by a [MOC](https://cds-astro.github.io/mocpy/), that you build yourself.
Lasair supports "watchmaps", defined by a [MOC](https://cds-astro.github.io/mocpy/), that you build yourself.

## Can I get alerts associated with my favourite sources?

@@ -100,9 +102,13 @@ The Lasair client is described [here](../core_functions/rest-api.html).

This is explained [here](../concepts/objects_sources.html).

## How do I search for an object by position in the sky?

This is called a "cone search". See the next question.

## What is a cone-search and can Lasair do this?

A *cone* in this contect means a point in the sky with an angular tolerance -- the opening
A *cone* in this context means a point in the sky with an angular tolerance -- the opening
angle of the cone, as explained [here](../concepts/sky-search.html).
You can use the [Lasair Sky Search](../core_functions/sky-search.html)
to do this, or run a cone search through the Lasair client, as sketched below.
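A minimal sketch, assuming the client's cone-search call takes RA and Dec in decimal degrees and a radius in arcseconds (the parameter names are assumptions; check the client documentation):

```
from lasair import lasair_client

L = lasair_client('your-api-token')   # token from "My Profile"

# Hypothetical cone search: objects within 60 arcseconds of this position.
result = L.cone(ra=141.15725, dec=25.39508, radius=60.0, requestType='all')
print(result)
```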
5 changes: 0 additions & 5 deletions services/externalBrokers/fink/junkfink_client_conf.py

This file was deleted.

14 changes: 8 additions & 6 deletions services/externalBrokers/fink/readme
@@ -6,12 +6,14 @@ sudo apt-get update
sudo apt-get -y install python3-pip

# Install fink-client somewhere on your computer
cd /home/ubuntu/lasair-lsst/services/fink
git clone https://github.com/astrolabsoftware/fink-client
cd fink-client
pip3 install --upgrade pip setuptools wheel
pip3 install -r requirements.txt
cd ..
pip install fink-client

#cd /home/ubuntu/lasair-lsst/services/fink
#git clone https://github.com/astrolabsoftware/fink-client
#cd fink-client
#pip3 install --upgrade pip setuptools wheel
#pip3 install -r requirements.txt
#cd ..

# credentials
cp fink_client_conf.py fink-client/fink_client
@@ -2,7 +2,9 @@
{% load static %}
{% block content %}


{% block title %}
Annotators
{% endblock title %}


<div class="container">
@@ -6,7 +6,7 @@


{% block title %}
{{data.objectId}}
Schema Browser
{% endblock title %}


@@ -30,6 +30,5 @@

</div>
</div>
<hr/>

{% endblock %}
11 changes: 7 additions & 4 deletions webserver/lasair/apps/db_schema/utils.py
@@ -78,11 +78,14 @@ def get_schema_for_query_selected(
select = select.strip()
if " " not in select:
select = select.split(".")
if len(select) == 2 and select[0] in schemas.keys():
listName = []
if len(select) == 2 and select[0].lower() in [k.lower() for k in schemas.keys()]:
if select[1] == "*":
for k, v in schemas[select[0]].items():
tableSchema[k] = v
if select[0] in schemas.keys():
for k, v in schemas[select[0]].items():
tableSchema[k] = v
else:
tableSchema[select[1]] = schemas[select[0]][select[1]]
if select[0] in schemas.keys() and select[1] in schemas[select[0]].keys():
tableSchema[select[1]] = schemas[select[0]][select[1]]

return tableSchema
@@ -39,7 +39,7 @@

</div>
</div>
<hr/>


{% endblock content %}
