`GET /schema`
The main endpoints are:
- `/v1/places/{place_id}?lang={lang}&type={type}&verbosity={verbosity}` to get the details of a place (admin, street, address, or POI).
  - `type`: (optional) one of `admin`, `street`, `address`, `poi`
  - `verbosity`: one of `long`, `short`. The default verbosity is `long`.
- `/v1/places?bbox={bbox}&category=<category-name>&size={size}` to get a list of all points of interest matching the given bbox and categories.
  - `bbox`: left,bot,right,top, e.g. `bbox=2.0,48.0,3.0,49.0`
  - `category`: multiple values are accepted (e.g. `category=leisure&category=museum`)
  - `size`: maximum number of places in the response
  - `verbosity`: the default verbosity is `list` (equivalent to `long`, except the "information" and "wiki" blocks are not returned)
  - `source`: (optional) to force a data source (instead of the automated selection based on coverage). Accepted values: `osm`, `pages_jaunes`
  - `q`: full-text query (optional, experimental)
- `/v1/places?bbox={bbox}&raw_filter=class,subclass&size={size}` to get a list of all points of interest matching the given bbox (left,bot,right,top, e.g. `bbox=2,48,3,49`) and the raw filters (e.g. `raw_filter=*,restaurant&raw_filter=shop,*&raw_filter=bakery,bakery`)
- `/v1/categories` to get the list of all the categories you can filter on.
- `/v1/pois/{poi_id}?lang={lang}` is the deprecated route to get the details of a POI.
- `/v1/directions`: see directions.md for details.
- `/v1/events?bbox={bbox}&category=<category_name>&size={size}` to get a list of all events matching the given bbox and category.
  - `bbox`: left,bot,right,top, e.g. `bbox=2.0,48.0,3.0,49.0`
  - `category`: a single value is accepted, among `concert`, `show`, `exhibition`, `sport`, `entertainment` (e.g. `category=concert`)
  - `size`: maximum number of events in the response
- `/v1/status` to get the status of the API and the associated ES cluster.
- `/v1/metrics` to get metrics on the API: statistics on the number of requests received, request durations, etc. This endpoint can be scraped by Prometheus.

Create the virtualenv and install dependencies:
```
pipenv install
```

and then:

```
IDUNN_MIMIR_ES=<url_to_MIMIR_ES> IDUNN_WIKI_ES=<url_to_WIKI_ES> pipenv run python app.py
```
You can then query the API on port 5000 (note the quotes, so that the shell does not interpret the `&`):

```
curl "localhost:5000/v1/places/toto?lang=fr&type=poi"
```
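Request URLs for the endpoints above can also be assembled programmatically. A minimal sketch using only the Python standard library — the base URL, the `toto` place ID, and the helper names are illustrative placeholders, not part of Idunn's API:

```python
from urllib.parse import urlencode

# Assumes a locally running instance, as started above.
BASE = "http://localhost:5000/v1"

def place_details_url(place_id, lang="fr", type_="poi", verbosity="long"):
    """Build a /v1/places/{place_id} URL with its optional parameters."""
    query = urlencode({"lang": lang, "type": type_, "verbosity": verbosity})
    return f"{BASE}/places/{place_id}?{query}"

def places_list_url(bbox, categories, size=10):
    """Build a /v1/places URL; `category` is a repeated parameter, so we
    pass urlencode a list of pairs instead of a dict."""
    pairs = [("bbox", bbox)] + [("category", c) for c in categories] + [("size", size)]
    return f"{BASE}/places?{urlencode(pairs)}"

print(place_details_url("toto"))
print(places_list_url("2.0,48.0,3.0,49.0", ["leisure", "museum"], size=5))
```

Any HTTP client (`curl`, `urllib.request`, `requests`, ...) can then issue a GET on the resulting URLs.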
The configuration can be provided in different ways:

- `utils/default_settings.yaml`
- the file referenced by the `IDUNN_CONFIG_FILE` environment variable (the default settings are still loaded and overridden)

Please note that you will need an API key from OpenWeatherMap in order to use the Weather block. You can then set it in the `IDUNN_WEATHER_API_KEY` environment variable, or directly as `WEATHER_API_KEY` inside the `utils/default_settings.yaml` file.
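The override order described above can be illustrated with a small sketch. This is not Idunn's actual settings loader — the keys and the merging logic are assumptions for illustration only:

```python
import os

# Stand-in for values read from utils/default_settings.yaml.
DEFAULT_SETTINGS = {"WEATHER_API_KEY": None, "VERBOSITY": "long"}

def load_settings(file_overrides=None, env=os.environ):
    """Defaults first, then the IDUNN_CONFIG_FILE values, then env vars."""
    settings = dict(DEFAULT_SETTINGS)
    if file_overrides:  # e.g. parsed from the file in IDUNN_CONFIG_FILE
        settings.update(file_overrides)
    weather_key = env.get("IDUNN_WEATHER_API_KEY")
    if weather_key:  # the environment variable wins over file values
        settings["WEATHER_API_KEY"] = weather_key
    return settings

print(load_settings({"VERBOSITY": "short"}, env={"IDUNN_WEATHER_API_KEY": "secret"}))
```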
Idunn comes with all the components needed to contribute as easily as possible: in particular, you don't need to have any Elasticsearch instance running. Idunn uses docker images to simulate the Elasticsearch sources and Redis, which means you will need a local docker install to be able to spawn an ES cluster.

To contribute, the common workflow is:

- install the dev dependencies: `pipenv install --dev`
- add tests in `./tests` for the new feature you propose
- run the test suite: `pipenv run pytest -vv -x`
- check the formatting: `pipenv run black --diff --check`
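A test added under `./tests` could look like the following sketch. The file name, the `parse_bbox` helper, and its behavior are illustrative assumptions, not existing Idunn code — adapt them to the feature you implement:

```python
# tests/test_bbox.py (illustrative name)

def parse_bbox(raw):
    """Hypothetical helper under test: parse 'left,bot,right,top'
    into four floats, rejecting inverted boxes."""
    left, bot, right, top = (float(part) for part in raw.split(","))
    if not (left < right and bot < top):
        raise ValueError("invalid bbox")
    return left, bot, right, top

def test_parse_bbox():
    assert parse_bbox("2.0,48.0,3.0,49.0") == (2.0, 48.0, 3.0, 49.0)

def test_parse_bbox_rejects_inverted_box():
    try:
        parse_bbox("3.0,48.0,2.0,49.0")  # left > right
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError")
```

`pipenv run pytest -vv -x` will collect and run any `test_*` functions in files named `test_*.py`.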
You can run Idunn with both Redis and Elasticsearch using docker. First, edit the `docker-compose.yml` file to set `IDUNN_MIMIR_ES` to the URL of your Elasticsearch instance (for example: `https://somewhere.lost/`).
Then you just need to run:

```
$ docker-compose up --build
```
If you need to clean the Redis cache, run:

```
$ docker-compose kill
$ docker image prune --filter "label=idunn_idunn"
```