diff --git a/README.md b/README.md index 1425001..d3b334e 100644 --- a/README.md +++ b/README.md @@ -1,13 +1,15 @@ -[![hacs_badge](https://img.shields.io/badge/HACS-Default-orange.svg)](https://github.com/custom-components/hacs) +![GitHub release (with filter)](https://img.shields.io/github/v/release/vingerha/gtfs2) ![GitHub](https://img.shields.io/github/license/vingerha/gtfs2) [![hacs_badge](https://img.shields.io/badge/HACS-Default-orange.svg)](https://github.com/custom-components/hacs) -# GTFS2 -This is an adaptation of the GTFS integration in HA Core, enhancements: +# GTFS2 for Static and RealTime +This is an adaptation of the GTFS integration in Home Assistant Core, with these enhancements: - configuration via the GUI, no configuration.yaml needed - Uses selected route to further select start/end stops - Shows next 10 departures on the same stretch start/end, including alternative transport lines if applicable -- allows to load/update/delete datasources in gtfs2 folder -- added a sservice to update the GTFS datasource, e.g. calling the service via automation -- translations: at present only English and French +- allows loading/updating/deleting datasources in the gtfs2 folder from the GUI +- Option to add a gtfs realtime source/url +- Option to add a gtfs realtime vehicle location source/url, generates a geojson file which can be used for tracking the vehicle on a map card +- A service to update the GTFS datasource, e.g. for calling the service via an automation +- translations: English and French ## Difference with GTFS HA core (outside of GUI setup) Core GTFS uses start + stop, it then determines every option between them and provides the next best option, regardless of the line/route @@ -17,6 +19,12 @@ Core GTFS uses start + stop, it then determines every option between them and pr ***Solution/workaround in GTFS2***: attribute added: next_departure_line shows all next departures with their line/means-of-transport. 
So even if you select a route first and then two stops, the attributes will still show alternatives between those 2 stops, if applicable. ## Updates + +20231126 +- realtime vehicle tracking with geojson output +- workflow tweaks +- extend update service call +- increase stability with reboots, loss of data(source) 20231110: adding features: - new attribute: next_departure_headsigns - adding route shortname in selection/list to overcome data discrepancies between short name and long name @@ -25,20 +33,18 @@ Core GTFS uses start + stop, it then determines every option between them and pr 20231104: initial version -## ToDo's / In Development -- Issue when updating the source db, it throws a db locked error. This when an existing entity for the same db starts polling it at the same time -- (DONE) Icon for the integration (brands) -- bypass setup control for routes that have no trips 'today'. The configuration does a spot-check if start/end actually return data with the idea to validate the setup. However, this only checks for 'today' so if your route actually has no transport running at the day of setup (say Sunday or Holiday) then it will reject it. -- (in DEV release) adding real-time data for providers that offer these for the same gtfs data: initally time and lat/long + +## ToDo's / In Development / Known Issues +- Issue when updating the source db: pygtfs error: at the moment unclear as errors fluctuate, possibly a lack of resources (mem/cpu) +- get realtime data for sources that are not based on routes, e.g. France's TER realtime source only uses trip_id ## Installation via HACS : -In HACS, select the 3-dots and then custom repositories -Add : +1. In HACS, select the 3-dots and then custom repositories, add : - URL : https://github.com/vingerha/gtfs2 - Category : Integration -In Settings > Devices & Sevices +2. 
In Settings > Devices & Services - add the integration, note that this is GTFS2 ## Configuration @@ -46,22 +52,28 @@ Use the workflow Example: https://github.com/vingerha/gtfs2/blob/main/example.md -**IMPORTANT** +## Real Time vehicle tracking + +As of v1.6, vehicle tracking outputs coordinates to a geojson file in your www folder, which in turn can be consumed by the geojson integration and map card https://www.home-assistant.io/integrations/geo_json_events/ +![image](https://github.com/vingerha/gtfs2/assets/44190435/a3cbea60-46f1-40e9-88c5-4b9a0519c782) + +## **IMPORTANT** +- sources need to adhere to GTFS standards both for static data (zip/sqlite) as well as for real-time data (binary). - certain providers publish large zip-files which in turn will result in much larger db files. Unpacking may take a long time (depending HA server perf.). Example for a 117Mb zip: ~2hrs to unpack to a 7Gb sqlite -- for these large db, performance may be slow too, there is a PR to improve this by adding indexes to the stop_times table - the integration uses folder /config/gtfs2 to store the datafiles (zip and sqlite) +- the integration uses folder /config/www for geojson files, only available when using vehicle tracking sources ## Data add / update Data can be updated at your own discretion by a service, e.g. you can have a weekly automation to run the service **Note:** for "update" to work, the name should be the ***same*** as the existing source. 
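The service call above can also be triggered from outside Home Assistant, e.g. from a scheduled script, via the standard REST API. A minimal sketch, assuming the service is registered as `gtfs2.update_gtfs` and accepts `name`/`url` data as in the service code below; the host and token are placeholders:

```python
# Hypothetical sketch: trigger the GTFS datasource update service through the
# Home Assistant REST API (POST /api/services/<domain>/<service>).
# The domain/service names and payload keys are assumptions; host and token
# are placeholders to be replaced with your own.
import json
import urllib.request

def build_update_request(base_url: str, token: str, name: str, url: str) -> urllib.request.Request:
    """Build a POST request for the (assumed) gtfs2.update_gtfs service."""
    endpoint = f"{base_url}/api/services/gtfs2/update_gtfs"
    # 'name' must match the existing datasource for an update to take place
    payload = json.dumps({"name": name, "url": url}).encode()
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(endpoint, data=payload, headers=headers, method="POST")

req = build_update_request(
    "http://homeassistant.local:8123",  # placeholder host
    "YOUR_LONG_LIVED_TOKEN",            # placeholder token
    name="mysource",
    url="https://example.com/gtfs.zip",
)
# urllib.request.urlopen(req)  # uncomment to actually send the call
```

Inside Home Assistant itself, a weekly automation calling the same service with the same data achieves the same result.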
It will first remove the existing source and reload the one as per your URL -![image](https://github.com/vingerha/gtfs2/assets/44190435/2defc23d-a1a0-40be-b610-6c5360fbd464) +![image](https://github.com/vingerha/gtfs2/assets/44190435/2d639afa-376b-4956-8223-2c982dc537cb) or via yaml -![image](https://github.com/vingerha/gtfs2/assets/44190435/2fea7926-a64d-43b6-a653-c95f1f01c66d) - +![image](https://github.com/vingerha/gtfs2/assets/44190435/0d50bb87-c081-4cd6-8dc5-9603a44c21a4) ## Known issues/challenges with source data Static gtfs: @@ -78,7 +90,5 @@ Realtime gtfs ## Thank you - @joostlek ... massive thanks for helping me through many (!) tech aspects and getting this to the initial version - @mxbssn for initiating, bringing ideas, helping with testing -- @mark1foley for his gtfs real time integration which I managed to alter/integrate - - +- @mark1foley for his gtfs real time integration which was enhanced with its integration in GTFS2 diff --git a/custom_components/gtfs2/__init__.py b/custom_components/gtfs2/__init__.py index fa688c5..de1aeeb 100644 --- a/custom_components/gtfs2/__init__.py +++ b/custom_components/gtfs2/__init__.py @@ -8,7 +8,9 @@ from datetime import timedelta from .const import DOMAIN, PLATFORMS, DEFAULT_PATH, DEFAULT_REFRESH_INTERVAL + from .coordinator import GTFSUpdateCoordinator, GTFSRealtimeUpdateCoordinator + import voluptuous as vol from .gtfs_helper import get_gtfs @@ -16,18 +18,66 @@ async def async_migrate_entry(hass, config_entry: ConfigEntry) -> bool: """Migrate old entry.""" - _LOGGER.debug("Migrating from version %s", config_entry.version) - if config_entry.version == 1: + _LOGGER.warning("Migrating from version %s", config_entry.version) - new = {**config_entry.data} - new['extract_from'] = 'url' - new.pop('refresh_interval') - - config_entry.version = 2 - hass.config_entries.async_update_entry(config_entry, data=new) + if config_entry.version == 1: - _LOGGER.debug("Migration to version %s successful", config_entry.version) + 
new_data = {**config_entry.data} + new_data['extract_from'] = 'url' + new_data.pop('refresh_interval') + + new_options = {**config_entry.options} + new_options['real_time'] = False + new_options['refresh_interval'] = 15 + new_options['api_key'] = "" + new_options['x_api_key'] = "" + new_options['offset'] = 0 + new_data.pop('offset') + + config_entry.version = 5 + hass.config_entries.async_update_entry(config_entry, data=new_data) + hass.config_entries.async_update_entry(config_entry, options=new_options) + + if config_entry.version == 2: + + new_options = {**config_entry.options} + new_data = {**config_entry.data} + new_options['real_time'] = False + new_options['api_key'] = "" + new_options['x_api_key'] = "" + new_options['offset'] = 0 + new_data.pop('offset') + + config_entry.version = 5 + hass.config_entries.async_update_entry(config_entry, options=new_options) + hass.config_entries.async_update_entry(config_entry, data=new_data) + + if config_entry.version == 3: + + new_options = {**config_entry.options} + new_data = {**config_entry.data} + new_options['api_key'] = "" + new_options['x_api_key'] = "" + new_options['offset'] = 0 + new_data.pop('offset') + + config_entry.version = 5 + hass.config_entries.async_update_entry(config_entry, options=new_options) + hass.config_entries.async_update_entry(config_entry, data=new_data) + + if config_entry.version == 4: + + new_options = {**config_entry.options} + new_data = {**config_entry.data} + new_options['offset'] = 0 + new_data.pop('offset') + + config_entry.version = 5 + hass.config_entries.async_update_entry(config_entry, data=new_data) + hass.config_entries.async_update_entry(config_entry, options=new_options) + + _LOGGER.warning("Migration to version %s successful", config_entry.version) return True @@ -40,6 +90,10 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool: #await coordinator.async_config_entry_first_refresh() + + if not coordinator.last_update_success: + raise 
ConfigEntryNotReady + hass.data[DOMAIN][entry.entry_id] = { "coordinator": coordinator, } @@ -65,7 +119,7 @@ def setup(hass, config): def update_gtfs(call): """My GTFS service.""" _LOGGER.debug("Updating GTFS with: %s", call.data) - get_gtfs(hass, DEFAULT_PATH, call.data["name"], call.data["url"], True) + get_gtfs(hass, DEFAULT_PATH, call.data, True) return True hass.services.register( @@ -74,6 +128,5 @@ def update_gtfs(call): async def update_listener(hass: HomeAssistant, entry: ConfigEntry): """Handle options update.""" - hass.data[DOMAIN][entry.entry_id]['coordinator'].update_interval = timedelta(minutes=entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)) - + hass.data[DOMAIN][entry.entry_id]['coordinator'].update_interval = timedelta(minutes=1) return True \ No newline at end of file diff --git a/custom_components/gtfs2/config_flow.py b/custom_components/gtfs2/config_flow.py index aeef210..1bea2bc 100644 --- a/custom_components/gtfs2/config_flow.py +++ b/custom_components/gtfs2/config_flow.py @@ -10,7 +10,19 @@ import homeassistant.helpers.config_validation as cv from homeassistant.core import HomeAssistant, callback -from .const import DEFAULT_PATH, DOMAIN, DEFAULT_REFRESH_INTERVAL +from homeassistant.helpers import selector + +from .const import ( + DEFAULT_PATH, + DOMAIN, + DEFAULT_REFRESH_INTERVAL, + DEFAULT_OFFSET, + CONF_API_KEY, + CONF_X_API_KEY, + CONF_VEHICLE_POSITION_URL, + CONF_TRIP_UPDATE_URL +) + from .gtfs_helper import ( get_gtfs, @@ -34,7 +46,8 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN): """Handle a config flow for GTFS.""" - VERSION = 2 + VERSION = 5 + def __init__(self) -> None: """Init ConfigFlow.""" @@ -76,27 +89,29 @@ async def async_step_user(self, user_input: dict | None = None) -> FlowResult: async def async_step_source(self, user_input: dict | None = None) -> FlowResult: """Handle a flow initialized by the user.""" errors: dict[str, str] = {} - if user_input is not None: - check_data = await 
self._check_data(user_input) - if check_data: - errors["base"] = check_data - return self.async_abort(reason=check_data) - else: - self._user_inputs.update(user_input) - _LOGGER.debug(f"UserInputs Data: {self._user_inputs}") - return await self.async_step_route() - return self.async_show_form( - step_id="source", - data_schema=vol.Schema( - { - vol.Required("file"): str, - vol.Required("extract_from"): vol.In({"zip": "Use gtfs2/zipfile with above name, without extension", "url": "Use URL below, leave 'na' if using zip"}), - vol.Required("url", default="na"): str, - }, - ), - errors=errors, - ) + if user_input is None: + return self.async_show_form( + step_id="source", + data_schema=vol.Schema( + { + vol.Required("extract_from"): selector.SelectSelector(selector.SelectSelectorConfig(options=["zip", "url"], translation_key="extract_from")), + vol.Required("file"): str, + vol.Required("url", default="na"): str, + }, + ), + errors=errors, + ) + check_data = await self._check_data(user_input) + _LOGGER.debug("Source check data: %s", check_data) + if check_data : + errors["base"] = check_data + return self.async_abort(reason=check_data) + else: + self._user_inputs.update(user_input) + _LOGGER.debug(f"UserInputs Data: {self._user_inputs}") + return await self.async_step_route() + async def async_step_remove(self, user_input: dict | None = None) -> FlowResult: """Handle a flow initialized by the user.""" @@ -114,32 +129,37 @@ async def async_step_remove(self, user_input: dict | None = None) -> FlowResult: ) try: removed = remove_datasource(self.hass, DEFAULT_PATH, user_input["file"]) - _LOGGER.debug(f"removed value: {removed}") + _LOGGER.debug(f"Removed gtfs data source: {removed}") except Exception as ex: - _LOGGER.info("Error while deleting : %s", {ex}) + _LOGGER.error("Error while deleting : %s", {ex}) return "generic_failure" return self.async_abort(reason="files_deleted") async def async_step_route(self, user_input: dict | None = None) -> FlowResult: """Handle the 
route.""" + errors: dict[str, str] = {} + check_data = await self._check_data(self._user_inputs) + _LOGGER.debug("Source check data: %s", check_data) + if check_data : + errors["base"] = check_data + return self.async_abort(reason=check_data) self._pygtfs = get_gtfs( self.hass, DEFAULT_PATH, self._user_inputs, False, ) - errors: dict[str, str] = {} + if user_input is None: return self.async_show_form( step_id="route", data_schema=vol.Schema( { vol.Required("route"): vol.In(get_route_list(self._pygtfs)), - vol.Required("direction"): vol.In( - {"0": "Outward", "1": "Return"} - ), + vol.Required("direction"): selector.SelectSelector(selector.SelectSelectorConfig(options=["0", "1"], translation_key="direction")), }, ), + errors=errors, ) self._user_inputs.update(user_input) _LOGGER.debug(f"UserInputs Route: {self._user_inputs}") @@ -165,10 +185,8 @@ async def async_step_stops(self, user_input: dict | None = None) -> FlowResult: vol.Required("origin"): vol.In(stops), vol.Required("destination", default=last_stop): vol.In(stops), vol.Required("name"): str, - vol.Optional("offset", default=0): int, - vol.Required("include_tomorrow"): vol.In( - {False: "No", True: "Yes"} - ), + vol.Optional("include_tomorrow", default = False): selector.BooleanSelector(), + }, ), errors=errors, @@ -189,7 +207,8 @@ async def _check_data(self, data): self._pygtfs = await self.hass.async_add_executor_job( get_gtfs, self.hass, DEFAULT_PATH, data, False ) - if self._pygtfs == "no_data_file" or "no_zip_file": + _LOGGER.debug("Checkdata: %s ", self._pygtfs) + if self._pygtfs in ['no_data_file', 'no_zip_file', 'extracting'] : return self._pygtfs return None @@ -203,11 +222,12 @@ async def _check_config(self, data): "schedule": self._pygtfs, "origin": data["origin"].split(": ")[0], "destination": data["destination"].split(": ")[0], - "offset": data["offset"], - "include_tomorrow": data["include_tomorrow"], + "offset": 0, + "include_tomorrow": True, "gtfs_dir": DEFAULT_PATH, "name": data["name"], 
"next_departure": None, + "file": data["file"], } try: @@ -215,7 +235,7 @@ async def _check_config(self, data): get_next_departure, self ) except Exception as ex: # pylint: disable=broad-except - _LOGGER.info( + _LOGGER.error( "Config: error getting gtfs data from generic helper: %s", {ex}, exc_info=1, @@ -246,24 +266,25 @@ async def async_step_init( ) -> FlowResult: """Manage the options.""" if user_input is not None: - user_input['real_time'] = False + if user_input['real_time']: self._user_inputs.update(user_input) - _LOGGER.debug(f"GTFS Options with realtime: {self._user_inputs}") return await self.async_step_real_time() else: self._user_inputs.update(user_input) _LOGGER.debug(f"GTFS Options without realtime: {self._user_inputs}") return self.async_create_entry(title="", data=self._user_inputs) - - return self.async_show_form( - step_id="init", - data_schema=vol.Schema( - { + + opt1_schema = { vol.Optional("refresh_interval", default=self.config_entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)): int, -# vol.Required("real_time"): vol.In({False: "No", True: "Yes"}), + vol.Optional("offset", default=self.config_entry.options.get("offset", DEFAULT_OFFSET)): int, + vol.Optional("real_time", default=self.config_entry.options.get("real_time")): selector.BooleanSelector() } - ), + + + return self.async_show_form( + step_id="init", + data_schema=vol.Schema(opt1_schema) ) async def async_step_real_time( @@ -273,14 +294,17 @@ async def async_step_real_time( errors: dict[str, str] = {} if user_input is not None: self._user_inputs.update(user_input) + _LOGGER.debug(f"GTFS Options with realtime: {self._user_inputs}") return self.async_create_entry(title="", data=self._user_inputs) return self.async_show_form( step_id="real_time", data_schema=vol.Schema( { - vol.Required("trip_update_url"): str, - vol.Required("vehicle_position_url"): str, + vol.Required(CONF_TRIP_UPDATE_URL, default=self.config_entry.options.get(CONF_TRIP_UPDATE_URL)): str, + 
vol.Optional(CONF_VEHICLE_POSITION_URL, default=self.config_entry.options.get(CONF_VEHICLE_POSITION_URL,"")): str, + vol.Optional(CONF_API_KEY, default=self.config_entry.options.get(CONF_API_KEY, "na")): str, + vol.Optional(CONF_X_API_KEY,default=self.config_entry.options.get(CONF_X_API_KEY, "na")): str }, ), errors=errors, diff --git a/custom_components/gtfs2/const.py b/custom_components/gtfs2/const.py index 865b814..50f0ebd 100644 --- a/custom_components/gtfs2/const.py +++ b/custom_components/gtfs2/const.py @@ -5,9 +5,11 @@ # default values for options DEFAULT_REFRESH_INTERVAL = 15 +DEFAULT_OFFSET = 0 DEFAULT_NAME = "GTFS Sensor2" DEFAULT_PATH = "gtfs2" +DEFAULT_PATH_GEOJSON = "www" CONF_DATA = "data" CONF_DESTINATION = "destination" @@ -24,6 +26,7 @@ ATTR_DROP_OFF_DESTINATION = "destination_stop_drop_off_type_state" ATTR_DROP_OFF_ORIGIN = "origin_stop_drop_off_type_state" ATTR_INFO = "info" +ATTR_INFO_RT = "info_realtime" ATTR_OFFSET = CONF_OFFSET ATTR_LAST = "last" ATTR_LOCATION_DESTINATION = "destination_station_location_type_name" @@ -236,3 +239,38 @@ WHEELCHAIR_BOARDING_DEFAULT = STATE_UNKNOWN WHEELCHAIR_BOARDING_OPTIONS = {1: True, 2: False} + +#gtfs_rt +ATTR_STOP_ID = "Stop ID" +ATTR_ROUTE = "Route" +ATTR_TRIP = "Trip" +ATTR_DIRECTION_ID = "Direction ID" +ATTR_DUE_IN = "Due in" +ATTR_DUE_AT = "Due at" +ATTR_NEXT_UP = "Next Service" +ATTR_ICON = "Icon" +ATTR_UNIT_OF_MEASUREMENT = "unit_of_measurement" +ATTR_DEVICE_CLASS = "device_class" +ATTR_LATITUDE = "latitude" +ATTR_LONGITUDE = "longitude" +ATTR_RT_UPDATED_AT = "gtfs_rt_updated_at" + +CONF_API_KEY = "api_key" +CONF_X_API_KEY = "x_api_key" +CONF_STOP_ID = "stopid" +CONF_ROUTE = "route" +CONF_DIRECTION_ID = "directionid" +CONF_DEPARTURES = "departures" +CONF_TRIP_UPDATE_URL = "trip_update_url" +CONF_VEHICLE_POSITION_URL = "vehicle_position_url" +CONF_ROUTE_DELIMITER = "route_delimiter" +CONF_ICON = "icon" +CONF_SERVICE_TYPE = "service_type" +CONF_RELATIVE_TIME = "show_relative_time" + +DEFAULT_SERVICE = 
"Service" +DEFAULT_ICON = "mdi:bus" +DEFAULT_DIRECTION = "0" + +TIME_STR_FORMAT = "%H:%M" + diff --git a/custom_components/gtfs2/coordinator.py b/custom_components/gtfs2/coordinator.py index c04f29a..bd82adb 100644 --- a/custom_components/gtfs2/coordinator.py +++ b/custom_components/gtfs2/coordinator.py @@ -1,16 +1,29 @@ """Data Update coordinator for the GTFS integration.""" from __future__ import annotations +import datetime from datetime import timedelta import logging + from homeassistant.config_entries import ConfigEntry from homeassistant.core import HomeAssistant from homeassistant.helpers.update_coordinator import DataUpdateCoordinator +import homeassistant.util.dt as dt_util + +from .const import ( + DEFAULT_PATH, + DEFAULT_REFRESH_INTERVAL, + CONF_API_KEY, + CONF_X_API_KEY, + ATTR_DUE_IN, + ATTR_LATITUDE, + ATTR_LONGITUDE, + ATTR_RT_UPDATED_AT +) +from .gtfs_helper import get_gtfs, get_next_departure, check_datasource_index, create_trip_geojson, check_extracting +from .gtfs_rt_helper import get_rt_route_statuses, get_rt_trip_statuses, get_next_services -from .const import DEFAULT_PATH, DEFAULT_REFRESH_INTERVAL -from .gtfs_helper import get_gtfs, get_next_departure, check_datasource_index -from .gtfs_rt_helper import get_rt_route_statuses, get_next_services _LOGGER = logging.getLogger(__name__) @@ -26,7 +39,7 @@ def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: hass=hass, logger=_LOGGER, name=entry.entry_id, - update_interval=timedelta(minutes=entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)), + update_interval=timedelta(minutes=1), ) self.config_entry = entry self.hass = hass @@ -35,87 +48,99 @@ def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: self._data: dict[str, str] = {} async def _async_update_data(self) -> dict[str, str]: - """Update.""" + """Get the latest data from GTFS and GTFS realtime, depending on refresh interval.""" data = self.config_entry.data options = self.config_entry.options + 
previous_data = None if self.data is None else self.data.copy() + _LOGGER.debug("Previous data: %s", previous_data) + self._pygtfs = get_gtfs( self.hass, DEFAULT_PATH, data, False - ) + ) self._data = { "schedule": self._pygtfs, "origin": data["origin"].split(": ")[0], "destination": data["destination"].split(": ")[0], - "offset": data["offset"], + "offset": options["offset"] if "offset" in options else 0, "include_tomorrow": data["include_tomorrow"], "gtfs_dir": DEFAULT_PATH, "name": data["name"], - } + "file": data["file"], + "extracting": False, + "next_departure": {}, + "next_departure_realtime_attr": {} + } + + if check_extracting(self): + _LOGGER.warning("Cannot update this sensor as still unpacking: %s", self._data["file"]) + previous_data["extracting"] = True + return previous_data + + # determine static + rt or only static (refresh schedule dependent) + #1. sensor exists with data but refresh interval not yet reached, use existing data + if previous_data is not None and (datetime.datetime.strptime(previous_data["gtfs_updated_at"],'%Y-%m-%dT%H:%M:%S.%f%z') + timedelta(minutes=options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL))) > dt_util.utcnow() + timedelta(seconds=1) : + run_static = False + _LOGGER.debug("No run static refresh: sensor exists but refresh not yet due for name: %s", data["name"]) + #2. 
sensor exists and refresh interval reached, get static data + else: + run_static = True + _LOGGER.debug("Run static refresh: sensor without gtfs data OR refresh for name: %s", data["name"]) - check_index = await self.hass.async_add_executor_job( - check_datasource_index, self._pygtfs - ) - - try: - self._data["next_departure"] = await self.hass.async_add_executor_job( - get_next_departure, self - ) - except Exception as ex: # pylint: disable=broad-except - _LOGGER.error("Error getting gtfs data from generic helper: %s", ex) - _LOGGER.debug("GTFS coordinator data from helper: %s", self._data["next_departure"]) - return self._data - -class GTFSRealtimeUpdateCoordinator(DataUpdateCoordinator): - """Data update coordinator for the GTFSRT integration.""" - - config_entry: ConfigEntry - - - def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: - """Initialize the coordinator.""" - _LOGGER.debug("GTFS RT: coordinator init") - super().__init__( - hass=hass, - logger=_LOGGER, - name=entry.entry_id, - update_interval=timedelta(minutes=entry.options.get("refresh_interval_rt", DEFAULT_REFRESH_INTERVAL_RT)), - ) - self.config_entry = entry - self.hass = hass - self._data: dict[str, str] = {} - - async def _async_update_data(self) -> dict[str, str]: - """Update.""" - data = self.config_entry.data - options = self.config_entry.options - _LOGGER.debug("GTFS RT: coordinator async_update_data: %s", data) - _LOGGER.debug("GTFS RT: coordinator async_update_data options: %s", options) - #add real_time if setup - - - if "real_time" in options: + if not run_static: + # do nothing awaiting refresh interval and use existing data + self._data = previous_data + else: + check_index = await self.hass.async_add_executor_job( + check_datasource_index, self + ) + + try: + self._data["next_departure"] = await self.hass.async_add_executor_job( + get_next_departure, self + ) + self._data["gtfs_updated_at"] = dt_util.utcnow().isoformat() + except Exception as ex: # pylint: 
disable=broad-except + _LOGGER.error("Error getting gtfs data from generic helper: %s", ex) + return None + _LOGGER.debug("GTFS coordinator data from helper: %s", self._data["next_departure"]) - """Initialize the info object.""" - self._trip_update_url = options["trip_update_url"] - self._vehicle_position_url = options["vehicle_position_url"] - self._route_delimiter = "-" -# if options["CONF_API_KEY"] is not None: -# self._headers = {"Authorization": options["CONF_API_KEY"]} -# elif options["CONF_X_API_KEY"] is not None: -# self._headers = {"x-api-key": options["CONF_X_API_KEY"]} -# else: -# self._headers = None - self._headers = None - self.info = {} - self._route_id = data["route"].split(": ")[0] - self._stop_id = data["origin"].split(": ")[0] - self._direction = data["direction"] - self._relative = False - #_LOGGER.debug("GTFS RT: Realtime data: %s", self._data) - self._data = await self.hass.async_add_executor_job(get_rt_route_statuses, self) - self._get_next_service = await self.hass.async_add_executor_job(get_next_services, self) - _LOGGER.debug("GTFS RT: Realtime next service: %s", self._get_next_service) + # collect and return rt attributes + # STILL REQUIRES A SOLUTION IF TIMING OUT + if "real_time" in options: + if options["real_time"]: + self._get_next_service = {} + """Initialize the info object.""" + self._trip_update_url = options["trip_update_url"] + self._vehicle_position_url = options["vehicle_position_url"] + self._route_delimiter = "-" + if CONF_API_KEY in options: + self._headers = {"Authorization": options[CONF_API_KEY]} + elif CONF_X_API_KEY in options: + self._headers = {"x-api-key": options[CONF_X_API_KEY]} + else: + self._headers = None + self._headers = None + self.info = {} + try: + self._route_id = self._data["next_departure"]["route_id"] + except Exception as ex: # pylint: disable=broad-except + _LOGGER.error("Error getting entity route_id for realtime data, for origin: %s with error: %s", data["origin"], ex) + self._route_id = 
data["route"].split(": ")[0] + self._stop_id = data["origin"].split(": ")[0] + self._trip_id = self._data.get('next_departure', {}).get('trip_id', None) + self._direction = data["direction"] + self._relative = False + try: + self._get_rt_route_statuses = await self.hass.async_add_executor_job(get_rt_route_statuses, self) + self._get_rt_trip_statuses = await self.hass.async_add_executor_job(get_rt_trip_statuses, self) + self._get_next_service = await self.hass.async_add_executor_job(get_next_services, self) + self._data["next_departure_realtime_attr"] = self._get_next_service + self._data["next_departure_realtime_attr"]["gtfs_rt_updated_at"] = dt_util.utcnow() + except Exception as ex: # pylint: disable=broad-except + _LOGGER.error("Error getting gtfs realtime data, for origin: %s with error: %s", data["origin"], ex) + else: + _LOGGER.debug("GTFS RT: RealTime = false, selected in entity options") else: - _LOGGER.error("GTFS RT: Issue with entity options") - return "---" - - return self._get_next_service + _LOGGER.debug("GTFS RT: RealTime not selected in entity options") + + return self._data diff --git a/custom_components/gtfs2/gtfs_helper.py b/custom_components/gtfs2/gtfs_helper.py index e2b0a67..4dbce7f 100644 --- a/custom_components/gtfs2/gtfs_helper.py +++ b/custom_components/gtfs2/gtfs_helper.py @@ -4,18 +4,27 @@ import datetime import logging import os +import json import requests import pygtfs from sqlalchemy.sql import text +import multiprocessing import homeassistant.util.dt as dt_util from homeassistant.core import HomeAssistant +from .const import DEFAULT_PATH_GEOJSON + _LOGGER = logging.getLogger(__name__) def get_next_departure(self): _LOGGER.debug("Get next departure with data: %s", self._data) + + if check_extracting(self): + _LOGGER.warning("Cannot get next departures on this datasource as still unpacking: %s", self._data["file"]) + return {} + """Get next departures from data.""" if self.hass.config.time_zone is None: _LOGGER.error("Timezone is not 
set in Home Assistant configuration") @@ -39,6 +48,7 @@ def get_next_departure(self): # days. limit = 24 * 60 * 60 * 2 tomorrow_select = tomorrow_where = tomorrow_order = "" + tomorrow_calendar_date_where = f"AND (calendar_date_today.date = :today)" if include_tomorrow: _LOGGER.debug("Include Tomorrow") limit = int(limit / 2 * 3) @@ -46,6 +56,7 @@ def get_next_departure(self): tomorrow_select = f"calendar.{tomorrow_name} AS tomorrow," tomorrow_where = f"OR calendar.{tomorrow_name} = 1" tomorrow_order = f"calendar.{tomorrow_name} DESC," + tomorrow_calendar_date_where = f"AND (calendar_date_today.date = :today or calendar_date_today.date = :tomorrow)" sql_query = f""" SELECT trip.trip_id, trip.route_id,trip.trip_headsign,route.route_long_name, @@ -135,7 +146,7 @@ def get_next_departure(self): WHERE start_station.stop_id = :origin_station_id AND end_station.stop_id = :end_station_id AND origin_stop_sequence < dest_stop_sequence - AND (calendar_date_today.date = :today or calendar_date_today.date = :tomorrow) + {tomorrow_calendar_date_where} ORDER BY today_cd, origin_depart_time """ # noqa: S608 result = schedule.engine.connect().execute( @@ -199,24 +210,29 @@ def get_next_departure(self): "Departure found for station %s @ %s -> %s", start_station_id, key, item ) break - + _LOGGER.debug("item: %s", item) + if item == {}: + data_returned = { + "gtfs_updated_at": dt_util.utcnow().isoformat(), + } + _LOGGER.info("No items found in gtfs") return {} # create upcoming timetable timetable_remaining = [] for key in sorted(timetable.keys()): if datetime.datetime.strptime(key, "%Y-%m-%d %H:%M:%S") > now: - timetable_remaining.append(key) + timetable_remaining.append(dt_util.as_utc(datetime.datetime.strptime(key, "%Y-%m-%d %H:%M:%S")).isoformat()) _LOGGER.debug( "Timetable Remaining Departures on this Start/Stop: %s", timetable_remaining ) # create upcoming timetable with line info timetable_remaining_line = [] - for key2, value in sorted(timetable.items()): - if 
datetime.datetime.strptime(key2, "%Y-%m-%d %H:%M:%S") > now: + for key, value in sorted(timetable.items()): + if datetime.datetime.strptime(key, "%Y-%m-%d %H:%M:%S") > now: timetable_remaining_line.append( - str(key2) + " (" + str(value["route_long_name"]) + ")" + str(dt_util.as_utc(datetime.datetime.strptime(key, "%Y-%m-%d %H:%M:%S")).isoformat()) + " (" + str(value["route_long_name"]) + ")" ) _LOGGER.debug( "Timetable Remaining Departures on this Start/Stop, per line: %s", @@ -224,10 +240,10 @@ def get_next_departure(self): ) # create upcoming timetable with headsign timetable_remaining_headsign = [] - for key2, value in sorted(timetable.items()): - if datetime.datetime.strptime(key2, "%Y-%m-%d %H:%M:%S") > now: + for key, value in sorted(timetable.items()): + if datetime.datetime.strptime(key, "%Y-%m-%d %H:%M:%S") > now: timetable_remaining_headsign.append( - str(key2) + " (" + str(value["trip_headsign"]) + ")" + str(dt_util.as_utc(datetime.datetime.strptime(key, "%Y-%m-%d %H:%M:%S")).isoformat()) + " (" + str(value["trip_headsign"]) + ")" ) _LOGGER.debug( "Timetable Remaining Departures on this Start/Stop, with headsign: %s", @@ -236,7 +252,15 @@ def get_next_departure(self): # Format arrival and departure dates and times, accounting for the # possibility of times crossing over midnight. 
+ _tomorrow = item.get("tomorrow") origin_arrival = now + dest_arrival = now + origin_depart_time = f"{now_date} {item['origin_depart_time']}" + if _tomorrow == 1: + origin_arrival = tomorrow + dest_arrival = tomorrow + origin_depart_time = f"{tomorrow_date} {item['origin_depart_time']}" + if item["origin_arrival_time"] > item["origin_depart_time"]: origin_arrival -= datetime.timedelta(days=1) origin_arrival_time = ( @@ -244,11 +268,8 @@ def get_next_departure(self): f"{item['origin_arrival_time']}" ) - origin_depart_time = f"{now_date} {item['origin_depart_time']}" - - dest_arrival = now if item["dest_arrival_time"] < item["origin_depart_time"]: - dest_arrival += datetime.timedelta(days=1) + dest_arrival += datetime.timedelta(days=1) dest_arrival_time = ( f"{dest_arrival.strftime(dt_util.DATE_STR_FORMAT)} {item['dest_arrival_time']}" ) @@ -260,8 +281,13 @@ def get_next_departure(self): f"{dest_depart.strftime(dt_util.DATE_STR_FORMAT)} {item['dest_depart_time']}" ) - depart_time = dt_util.parse_datetime(origin_depart_time).replace(tzinfo=timezone) + # align on timezone + depart_time = dt_util.parse_datetime(origin_depart_time).replace(tzinfo=timezone) arrival_time = dt_util.parse_datetime(dest_arrival_time).replace(tzinfo=timezone) + origin_arrival_time = dt_util.as_utc(datetime.datetime.strptime(origin_arrival_time, "%Y-%m-%d %H:%M:%S")).isoformat() + origin_depart_time = dt_util.as_utc(datetime.datetime.strptime(origin_depart_time, "%Y-%m-%d %H:%M:%S")).isoformat() + dest_arrival_time = dt_util.as_utc(datetime.datetime.strptime(dest_arrival_time, "%Y-%m-%d %H:%M:%S")).isoformat() + dest_depart_time = dt_util.as_utc(datetime.datetime.strptime(dest_depart_time, "%Y-%m-%d %H:%M:%S")).isoformat() origin_stop_time = { "Arrival Time": origin_arrival_time, @@ -284,8 +310,8 @@ def get_next_departure(self): "Sequence": item["dest_stop_sequence"], "Timepoint": item["dest_stop_timepoint"], } - - return { + + data_returned = { "trip_id": item["trip_id"], "route_id": 
item["route_id"], "day": item["day"], @@ -299,18 +325,28 @@ def get_next_departure(self): "next_departures_lines": timetable_remaining_line, "next_departures_headsign": timetable_remaining_headsign, } - + _LOGGER.debug("Data returned: %s", data_returned) + + return data_returned def get_gtfs(hass, path, data, update=False): """Get gtfs file.""" _LOGGER.debug("Getting gtfs with data: %s", data) + gtfs_dir = hass.config.path(path) + os.makedirs(gtfs_dir, exist_ok=True) filename = data["file"] url = data["url"] file = data["file"] + ".zip" - gtfs_dir = hass.config.path(path) - os.makedirs(gtfs_dir, exist_ok=True) - if update and os.path.exists(os.path.join(gtfs_dir, file)): + sqlite = data["file"] + ".sqlite" + journal = os.path.join(gtfs_dir, filename + ".sqlite-journal") + if os.path.exists(journal) and not update : + _LOGGER.warning("Cannot use this datasource as still unpacking: %s", filename) + return "extracting" + if update and data["extract_from"] == "url" and os.path.exists(os.path.join(gtfs_dir, file)): remove_datasource(hass, path, filename) + + if update and data["extract_from"] == "zip" and os.path.exists(os.path.join(gtfs_dir, file)) and os.path.exists(os.path.join(gtfs_dir, sqlite)): + os.remove(os.path.join(gtfs_dir, sqlite)) if data["extract_from"] == "zip": if not os.path.exists(os.path.join(gtfs_dir, file)): _LOGGER.error("The given GTFS zipfile was not found") @@ -326,18 +362,13 @@ def get_gtfs(hass, path, data, update=False): (gtfs_root, _) = os.path.splitext(file) sqlite_file = f"{gtfs_root}.sqlite?check_same_thread=False" - joined_path = os.path.join(gtfs_dir, sqlite_file) - _LOGGER.debug("unpacking: %s", joined_path) - gtfs = pygtfs.Schedule(joined_path) - # check or wait for unpack - journal = os.path.join(gtfs_dir, filename + ".sqlite-journal") - while os.path.isfile(journal): - time.sleep(10) + joined_path = os.path.join(gtfs_dir, sqlite_file) + gtfs = pygtfs.Schedule(joined_path) if not gtfs.feeds: + _LOGGER.info("Starting gtfs file 
unpacking: %s", joined_path) pygtfs.append_feed(gtfs, os.path.join(gtfs_dir, file)) return gtfs - def get_route_list(schedule): sql_routes = f""" SELECT route_id, route_short_name, route_long_name from routes @@ -353,7 +384,7 @@ def get_route_list(schedule): row = row_cursor._asdict() routes_list.append(list(row_cursor)) for x in routes_list: - val = x[0] + ": " + x[1] + " (" + x[2] + ")" + val = str(x[0]) + ": " + str(x[1]) + " (" + str(x[2]) + ")" routes.append(val) _LOGGER.debug(f"routes: {routes}") return routes @@ -390,14 +421,12 @@ def get_datasources(hass, path) -> dict[str]: _LOGGER.debug(f"Datasources path: {path}") gtfs_dir = hass.config.path(path) os.makedirs(gtfs_dir, exist_ok=True) - _LOGGER.debug(f"Datasources folder: {gtfs_dir}") files = os.listdir(gtfs_dir) - _LOGGER.debug(f"Datasources files: {files}") datasources = [] for file in files: if file.endswith(".sqlite"): - datasources.append(file.split(".")[0]) - _LOGGER.debug(f"datasources: {datasources}") + datasources.append(file.split(".")[0]) + _LOGGER.debug(f"Datasources in folder: {datasources}") return datasources @@ -407,9 +436,23 @@ def remove_datasource(hass, path, filename): os.remove(os.path.join(gtfs_dir, filename + ".zip")) os.remove(os.path.join(gtfs_dir, filename + ".sqlite")) return "removed" + +def check_extracting(self): + gtfs_dir = self.hass.config.path(self._data["gtfs_dir"]) + filename = self._data["file"] + journal = os.path.join(gtfs_dir, filename + ".sqlite-journal") + if os.path.exists(journal) : + _LOGGER.debug("check extracting: yes") + return True + return False -def check_datasource_index(schedule): +def check_datasource_index(self): + _LOGGER.debug("Check datasource with data: %s", self._data) + if check_extracting(self): + _LOGGER.warning("Cannot check indexes on this datasource as still unpacking: %s", self._data["file"]) + return + schedule=self._pygtfs sql_index_1 = f""" SELECT count(*) as checkidx FROM sqlite_master @@ -422,12 +465,21 @@ def 
check_datasource_index(schedule): WHERE type= 'index' and tbl_name = 'stop_times' and name like '%stop_id%'; """ + sql_index_3 = f""" + SELECT count(*) as checkidx + FROM sqlite_master + WHERE + type= 'index' and tbl_name = 'shapes' and name like '%shape_id%'; + """ sql_add_index_1 = f""" create index gtfs2_stop_times_trip_id on stop_times(trip_id) """ sql_add_index_2 = f""" create index gtfs2_stop_times_stop_id on stop_times(stop_id) """ + sql_add_index_3 = f""" + create index gtfs2_shapes_shape_id on shapes(shape_id) + """ result_1a = schedule.engine.connect().execute( text(sql_index_1), {"q": "q"}, @@ -435,7 +487,7 @@ def check_datasource_index(schedule): for row_cursor in result_1a: _LOGGER.debug("IDX result1: %s", row_cursor._asdict()) if row_cursor._asdict()['checkidx'] == 0: - _LOGGER.info("Adding index 1 to improve performance") + _LOGGER.debug("Adding index 1 to improve performance") result_1b = schedule.engine.connect().execute( text(sql_add_index_1), {"q": "q"}, @@ -448,8 +500,52 @@ def check_datasource_index(schedule): for row_cursor in result_2a: _LOGGER.debug("IDX result2: %s", row_cursor._asdict()) if row_cursor._asdict()['checkidx'] == 0: - _LOGGER.info("Adding index 2 to improve performance") + _LOGGER.debug("Adding index 2 to improve performance") result_2b = schedule.engine.connect().execute( text(sql_add_index_2), {"q": "q"}, - ) \ No newline at end of file + ) + + result_3a = schedule.engine.connect().execute( + text(sql_index_3), + {"q": "q"}, + ) + for row_cursor in result_3a: + _LOGGER.debug("IDX result3: %s", row_cursor._asdict()) + if row_cursor._asdict()['checkidx'] == 0: + _LOGGER.debug("Adding index 3 to improve performance") + result_3b = schedule.engine.connect().execute( + text(sql_add_index_3), + {"q": "q"}, + ) + +def create_trip_geojson(self): + _LOGGER.debug("GTFS Helper, create geojson with data: %s", self._data) + schedule = self._data["schedule"] + self._trip_id = self._data["next_departure"]["trip_id"] + sql_shape = f""" + 
SELECT t.trip_id, s.shape_pt_lat, s.shape_pt_lon + FROM trips t, shapes s + WHERE + t.shape_id = s.shape_id + and t.trip_id = '{self._trip_id}' + order by s.shape_pt_sequence + """ + result = schedule.engine.connect().execute( + text(sql_shape), + {"q": "q"}, + ) + + shapes_list = [] + coordinates = [] + for row_cursor in result: + row = row_cursor._asdict() + shapes_list.append(list(row_cursor)) + for x in shapes_list: + coordinate = [] + coordinate.append(x[2]) + coordinate.append(x[1]) + coordinates.append(coordinate) + self.geojson = {"features": [{"geometry": {"coordinates": coordinates, "type": "LineString"}, "properties": {"id": self._trip_id, "title": self._trip_id}, "type": "Feature"}], "type": "FeatureCollection"} + _LOGGER.debug("Geojson: %s", json.dumps(self.geojson)) + return None \ No newline at end of file diff --git a/custom_components/gtfs2/gtfs_rt_helper.py b/custom_components/gtfs2/gtfs_rt_helper.py index b8b83b0..7d2e824 100644 --- a/custom_components/gtfs2/gtfs_rt_helper.py +++ b/custom_components/gtfs2/gtfs_rt_helper.py @@ -1,6 +1,10 @@ import logging from datetime import datetime, timedelta +import json +import os + + import homeassistant.helpers.config_validation as cv import homeassistant.util.dt as dt_util import requests @@ -13,63 +17,42 @@ _LOGGER = logging.getLogger(__name__) -ATTR_STOP_ID = "Stop ID" -ATTR_ROUTE = "Route" -ATTR_DIRECTION_ID = "Direction ID" -ATTR_DUE_IN = "Due in" -ATTR_DUE_AT = "Due at" -ATTR_NEXT_UP = "Next Service" -ATTR_ICON = "Icon" -ATTR_UNIT_OF_MEASUREMENT = "unit_of_measurement" -ATTR_DEVICE_CLASS = "device_class" - -CONF_API_KEY = "api_key" -CONF_X_API_KEY = "x_api_key" -CONF_STOP_ID = "stopid" -CONF_ROUTE = "route" -CONF_DIRECTION_ID = "directionid" -CONF_DEPARTURES = "departures" -CONF_TRIP_UPDATE_URL = "trip_update_url" -CONF_VEHICLE_POSITION_URL = "vehicle_position_url" -CONF_ROUTE_DELIMITER = "route_delimiter" -CONF_ICON = "icon" -CONF_SERVICE_TYPE = "service_type" -CONF_RELATIVE_TIME = 
"show_relative_time" - -DEFAULT_SERVICE = "Service" -DEFAULT_ICON = "mdi:bus" -DEFAULT_DIRECTION = "0" - -MIN_TIME_BETWEEN_UPDATES = timedelta(seconds=60) -TIME_STR_FORMAT = "%H:%M" - -PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend( - { - vol.Required(CONF_TRIP_UPDATE_URL): cv.string, - vol.Optional(CONF_API_KEY): cv.string, - vol.Optional(CONF_X_API_KEY): cv.string, - vol.Optional(CONF_VEHICLE_POSITION_URL): cv.string, - vol.Optional(CONF_ROUTE_DELIMITER): cv.string, - - vol.Optional(CONF_DEPARTURES): [ - { - vol.Required(CONF_NAME): cv.string, - vol.Required(CONF_STOP_ID): cv.string, - vol.Required(CONF_ROUTE): cv.string, - vol.Optional(CONF_RELATIVE_TIME, default=True): cv.boolean, - vol.Optional( - CONF_DIRECTION_ID, - default=DEFAULT_DIRECTION, # type: ignore - ): str, - vol.Optional( - CONF_ICON, default=DEFAULT_ICON # type: ignore - ): cv.string, - vol.Optional( - CONF_SERVICE_TYPE, default=DEFAULT_SERVICE # type: ignore - ): cv.string, - } - ], - } +from .const import ( + + ATTR_STOP_ID, + ATTR_ROUTE, + ATTR_TRIP, + ATTR_DIRECTION_ID, + ATTR_DUE_IN, + ATTR_DUE_AT, + ATTR_NEXT_UP, + ATTR_ICON, + ATTR_UNIT_OF_MEASUREMENT, + ATTR_DEVICE_CLASS, + ATTR_LATITUDE, + ATTR_LONGITUDE, + + CONF_API_KEY, + CONF_X_API_KEY, + CONF_STOP_ID, + CONF_ROUTE, + CONF_DIRECTION_ID, + CONF_DEPARTURES, + CONF_TRIP_UPDATE_URL, + CONF_VEHICLE_POSITION_URL, + CONF_ROUTE_DELIMITER, + CONF_ICON, + CONF_SERVICE_TYPE, + CONF_RELATIVE_TIME, + + DEFAULT_SERVICE, + DEFAULT_ICON, + DEFAULT_DIRECTION, + DEFAULT_PATH, + DEFAULT_PATH_GEOJSON, + + TIME_STR_FORMAT + ) def due_in_minutes(timestamp): @@ -98,6 +81,8 @@ def log_debug(data: list, indent_level: int) -> None: def get_gtfs_feed_entities(url: str, headers, label: str): + _LOGGER.debug(f"GTFS RT get_feed_entities for url: {url} , headers: {headers}, label: {label}") + feed = gtfs_realtime_pb2.FeedMessage() # type: ignore # TODO add timeout to requests call @@ -114,21 +99,30 @@ def get_gtfs_feed_entities(url: str, headers, label: str): 0, ) - 
feed.ParseFromString(response.content) + feed.ParseFromString(response.content) + #_LOGGER.debug("Feed entity: %s", feed.entity) return feed.entity - - -## reworked for gtfs2 - def get_next_services(self): - self.data = self._data + self.data = self._get_rt_route_statuses self._stop = self._stop_id self._route = self._route_id + self._trip = self._trip_id self._direction = self._direction - _LOGGER.debug("Get Next Services, route/direction/stop: %s", self.data.get(self._route, {}).get(self._direction, {}).get(self._stop, [])) - + _LOGGER.debug("RT route: %s", self._route) + _LOGGER.debug("RT trip: %s", self._trip) + _LOGGER.debug("RT stop: %s", self._stop) + _LOGGER.debug("RT direction: %s", self._direction) next_services = self.data.get(self._route, {}).get(self._direction, {}).get(self._stop, []) + _LOGGER.debug("Next services route_id: %s", next_services) + if not next_services: + self._direction = 0 + self.data2 = self._get_rt_trip_statuses + next_services = self.data2.get(self._trip, {}).get(self._direction, {}).get(self._stop, []) + _LOGGER.debug("Next services trip_id: %s", next_services) + if next_services: + _LOGGER.debug("Next services trip_id[0].arrival_time: %s", next_services[0].arrival_time) + if self.hass.config.time_zone is None: _LOGGER.error("Timezone is not set in Home Assistant configuration") timezone = "UTC" @@ -136,22 +130,63 @@ def get_next_services(self): timezone=dt_util.get_time_zone(self.hass.config.time_zone) if self._relative : - return ( + + due_in = ( due_in_minutes(next_services[0].arrival_time) if len(next_services) > 0 else "-" ) else: - return ( - next_services[0].arrival_time.replace(tzinfo=timezone) + due_in = ( + dt_util.as_utc(next_services[0].arrival_time) if len(next_services) > 0 else "-" ) + attrs = { + ATTR_DUE_IN: due_in, + ATTR_STOP_ID: self._stop, + ATTR_ROUTE: self._route, + ATTR_TRIP: self._trip, + ATTR_DIRECTION_ID: self._direction, + ATTR_LATITUDE: "", + ATTR_LONGITUDE: "" + } + if len(next_services) > 0: + 
attrs[ATTR_DUE_AT] = ( + next_services[0].arrival_time.strftime(TIME_STR_FORMAT) + if len(next_services) > 0 + else "-" + ) + if next_services[0].position: + if next_services[0].position[0]: + attrs[ATTR_LATITUDE] = next_services[0].position[0][1] + attrs[ATTR_LONGITUDE] = next_services[0].position[0][0] + if len(next_services) > 1: + attrs[ATTR_NEXT_UP] = ( + next_services[1].arrival_time.strftime(TIME_STR_FORMAT) + if len(next_services) > 1 + else "-" + ) + if self._relative : + attrs[ATTR_UNIT_OF_MEASUREMENT] = "min" + else : + attrs[ATTR_DEVICE_CLASS] = ( + "timestamp" + if len(next_services) > 0 + else "" + ) + + _LOGGER.debug("GTFS RT next services attributes: %s", attrs) + return attrs + def get_rt_route_statuses(self): - + vehicle_positions = {} + if self._vehicle_position_url != "" : + vehicle_positions = get_rt_vehicle_positions(self) + class StopDetails: def __init__(self, arrival_time, position): @@ -166,22 +201,23 @@ def __init__(self, arrival_time, position): for entity in feed_entities: if entity.HasField("trip_update"): + # Commented out as it spams the log even at debug level # If delimiter specified split the route ID in the gtfs rt feed - log_debug( - [ - "Received Trip ID", - entity.trip_update.trip.trip_id, - "Route ID:", - entity.trip_update.trip.route_id, - "direction ID", - entity.trip_update.trip.direction_id, - "Start Time:", - entity.trip_update.trip.start_time, - "Start Date:", - entity.trip_update.trip.start_date, - ], - 1, - ) + #log_debug( + #[ + # "Received Trip ID", + # entity.trip_update.trip.trip_id, + # "Route ID:", + # entity.trip_update.trip.route_id, + # "direction ID", + # entity.trip_update.trip.direction_id, + # "Start Time:", + # entity.trip_update.trip.start_time, + # "Start Date:", + # entity.trip_update.trip.start_date, + #], + #1, + #) if self._route_delimiter is not None: route_id_split = entity.trip_update.trip.route_id.split( self._route_delimiter @@ -190,21 +226,23 @@ def __init__(self, arrival_time, position): route_id =
entity.trip_update.trip.route_id else: route_id = route_id_split[0] - log_debug( - [ - "Feed Route ID", - entity.trip_update.trip.route_id, - "changed to", - route_id, - ], - 1, - ) + # Commented out as it spams the log even at debug level + #log_debug( + # [ + # "Feed Route ID", + # entity.trip_update.trip.route_id, + # "changed to", + # route_id, + # ], + # 1, + #) + else: route_id = entity.trip_update.trip.route_id if route_id not in departure_times: - departure_times[route_id] = {} + departure_times[route_id] = {} if entity.trip_update.trip.direction_id is not None: direction_id = str(entity.trip_update.trip.direction_id) @@ -225,40 +263,38 @@ def __init__(self, arrival_time, position): stop_time = stop.departure.time else: stop_time = stop.arrival.time - log_debug( - [ - "Stop:", - stop_id, - "Stop Sequence:", - stop.stop_sequence, - "Stop Time:", - stop_time, - ], - 2, - ) + #log_debug( + # [ + # "Stop:", + # stop_id, + # "Stop Sequence:", + # stop.stop_sequence, + # "Stop Time:", + # stop_time, + # ], + # 2, + #) # Ignore arrival times in the past if due_in_minutes(datetime.fromtimestamp(stop_time)) >= 0: - log_debug( - [ - "Adding route ID", - route_id, - "trip ID", - entity.trip_update.trip.trip_id, - "direction ID", - entity.trip_update.trip.direction_id, - "stop ID", - stop_id, - "stop time", - stop_time, - ], - 3, - ) + #log_debug( + # [ + # "Adding route ID", + # route_id, + # "trip ID", + # entity.trip_update.trip.trip_id, + # "direction ID", + # entity.trip_update.trip.direction_id, + # "stop ID", + # stop_id, + # "stop time", + # stop_time, + # ], + # 3, + #) details = StopDetails( datetime.fromtimestamp(stop_time), - vehicle_positions.get( - entity.trip_update.trip.trip_id - ), + [d["properties"].get(entity.trip_update.trip.trip_id) for d in vehicle_positions], ) departure_times[route_id][direction_id][ stop_id ].append(details) @@ -273,20 +309,116 @@ def __init__(self, arrival_time, position): ) self.info = departure_times - + + #_LOGGER.debug("Departure times: %s",
departure_times) return departure_times + +def get_rt_trip_statuses(self): + + vehicle_positions = {} + + if self._vehicle_position_url != "" : + vehicle_positions = get_rt_vehicle_positions(self) + + class StopDetails: + def __init__(self, arrival_time, position): + self.arrival_time = arrival_time + self.position = position + + departure_times = {} + + feed_entities = get_gtfs_feed_entities( + url=self._trip_update_url, headers=self._headers, label="trip data" + ) + + for entity in feed_entities: + if entity.HasField("trip_update"): + trip_id = entity.trip_update.trip.trip_id + #_LOGGER.debug("RT Trip, trip: %s", trip) + #_LOGGER.debug("RT Trip, trip_id: %s", self._trip_id) + + if trip_id == self._trip_id: + _LOGGER.debug("RT Trip, found trip: %s", trip_id) + + if trip_id not in departure_times: + departure_times[trip_id] = {} + + if entity.trip_update.trip.direction_id is not None: + direction_id = str(entity.trip_update.trip.direction_id) + else: + direction_id = DEFAULT_DIRECTION + if direction_id not in departure_times[trip_id]: + departure_times[trip_id][direction_id] = {} + + for stop in entity.trip_update.stop_time_update: + stop_id = stop.stop_id + if not departure_times[trip_id][direction_id].get( + stop_id + ): + departure_times[trip_id][direction_id][stop_id] = [] + # Use stop arrival time; + # fall back on departure time if not available + if stop.arrival.time == 0: + stop_time = stop.departure.time + else: + stop_time = stop.arrival.time + #log_debug( + # [ + # "Stop:", + # stop_id, + # "Stop Sequence:", + # stop.stop_sequence, + # "Stop Time:", + # stop_time, + # ], + # 2, + #) + # Ignore arrival times in the past + if due_in_minutes(datetime.fromtimestamp(stop_time)) >= 0: + #log_debug( + # [ + # "Adding trip ID", + # entity.trip_update.trip.trip_id, + # "direction ID", + # entity.trip_update.trip.direction_id, + # "stop ID", + # stop_id, + # "stop time", + # stop_time, + # ], + # 3, + #) + + details = StopDetails( + 
datetime.fromtimestamp(stop_time), + [d["properties"].get(entity.trip_update.trip.trip_id) for d in vehicle_positions], + ) + departure_times[trip_id][direction_id][ + stop_id + ].append(details) + + # Sort by arrival time + for trip in departure_times: + for direction in departure_times[trip]: + for stop in departure_times[trip][direction]: + departure_times[trip][direction][stop].sort( + key=lambda t: t.arrival_time + ) + + self.info = departure_times + #_LOGGER.debug("Departure times Trip: %s", departure_times) + return departure_times def get_rt_vehicle_positions(self): - positions = {} feed_entities = get_gtfs_feed_entities( url=self._vehicle_position_url, headers=self._headers, label="vehicle positions", ) - + geojson_body = [] + geojson_element = {"geometry": {"coordinates":[],"type": "Point"}, "properties": {"id": "", "title": "", "trip_id": "", "route_id": "", "direction_id": "", "vehicle_id": "", "vehicle_label": ""}, "type": "Feature"} for entity in feed_entities: vehicle = entity.vehicle - if not vehicle.trip.trip_id: # Vehicle is not in service continue @@ -294,15 +426,47 @@ def get_rt_vehicle_positions(self): [ "Adding position for trip ID", vehicle.trip.trip_id, + "route ID", + vehicle.trip.route_id, + "direction ID", + vehicle.trip.direction_id, "position latitude", vehicle.position.latitude, "longitude", vehicle.position.longitude, ], 2, - ) - - positions[vehicle.trip.trip_id] = vehicle.position - - return positions + ) + + #construct geojson only for configured route/direction + if str(self._route_id) == str(vehicle.trip.route_id) and str(self._direction) == str(vehicle.trip.direction_id): + geojson_element = {"geometry": {"coordinates":[],"type": "Point"}, "properties": {"id": "", "title": "", "trip_id": "", "route_id": "", "direction_id": "", "vehicle_id": "", "vehicle_label": ""}, "type": "Feature"} + geojson_element["geometry"]["coordinates"] = [] + geojson_element["geometry"]["coordinates"].append(vehicle.position.longitude) +
geojson_element["geometry"]["coordinates"].append(vehicle.position.latitude) + geojson_element["properties"]["id"] = str(vehicle.trip.route_id) + "(" + str(vehicle.trip.direction_id) + ")" + geojson_element["properties"]["title"] = str(vehicle.trip.route_id) + "(" + str(vehicle.trip.direction_id) + ")" + geojson_element["properties"]["trip_id"] = vehicle.trip.trip_id + geojson_element["properties"]["route_id"] = vehicle.trip.route_id + geojson_element["properties"]["direction_id"] = vehicle.trip.direction_id + geojson_element["properties"]["vehicle_id"] = "tbd" + geojson_element["properties"]["vehicle_label"] = "tbd" + geojson_element["properties"][vehicle.trip.trip_id] = geojson_element["geometry"]["coordinates"] + geojson_body.append(geojson_element) + + self.geojson = {"features": geojson_body, "type": "FeatureCollection"} + + _LOGGER.debug("GTFS RT geojson: %s", json.dumps(self.geojson)) + self._route_dir = self._route_id + "_" + self._direction + update_geojson(self) + return geojson_body + + +def update_geojson(self): + geojson_dir = self.hass.config.path(DEFAULT_PATH_GEOJSON) + os.makedirs(geojson_dir, exist_ok=True) + file = os.path.join(geojson_dir, self._route_dir + ".json") + _LOGGER.debug("GTFS RT geojson file: %s", file) + with open(file, "w") as outfile: + json.dump(self.geojson, outfile) diff --git a/custom_components/gtfs2/manifest.json b/custom_components/gtfs2/manifest.json index 8dbe4c3..63fac40 100644 --- a/custom_components/gtfs2/manifest.json +++ b/custom_components/gtfs2/manifest.json @@ -7,5 +7,5 @@ "iot_class": "local_polling", "issue_tracker": "https://github.com/vingerha/gtfs2/issues", "requirements": ["pygtfs==0.1.9","gtfs-realtime-bindings==1.0.0"], - "version": "0.1.5" + "version": "0.1.6" } diff --git a/custom_components/gtfs2/sensor.py b/custom_components/gtfs2/sensor.py index 30800fd..7a79572 100644 --- a/custom_components/gtfs2/sensor.py +++ b/custom_components/gtfs2/sensor.py @@ -22,6 +22,7 @@ ATTR_DROP_OFF_ORIGIN, ATTR_FIRST, 
ATTR_INFO, + ATTR_INFO_RT, ATTR_LAST, ATTR_LOCATION_DESTINATION, ATTR_LOCATION_ORIGIN, @@ -120,22 +121,24 @@ def icon(self) -> str: def _update_attrs(self): # noqa: C901 PLR0911 _LOGGER.debug(f"SENSOR update attr DATA: {self.coordinator.data}") self._pygtfs = self.coordinator.data["schedule"] + self.extracting = self.coordinator.data["extracting"] self.origin = self.coordinator.data["origin"].split(": ")[0] self.destination = self.coordinator.data["destination"].split(": ")[0] self._include_tomorrow = self.coordinator.data["include_tomorrow"] self._offset = self.coordinator.data["offset"] - self._departure = self.coordinator.data["next_departure"] + self._departure = self.coordinator.data.get("next_departure",None) + self._departure_rt = self.coordinator.data.get("next_departure_realtime_attr",None) self._available = False self._icon = ICON self._state: datetime.datetime | None = None self._attr_device_class = SensorDeviceClass.TIMESTAMP - self._origin = None - self._destination = None self._trip = None self._route = None self._agency = None + self._origin = None + self._destination = None # Fetch valid stop information once - if not self._origin: + if not self._origin and not self.extracting: stops = self._pygtfs.stops_by_id(self.origin) if not stops: self._available = False @@ -143,25 +146,28 @@ def _update_attrs(self): # noqa: C901 PLR0911 return self._origin = stops[0] - if not self._destination: + if not self._destination and not self.extracting: stops = self._pygtfs.stops_by_id(self.destination) if not stops: self._available = False - _LOGGER.warning("Destination stop ID %s not found", self.destination) + _LOGGER.warning( + "Destination stop ID %s not found", self.destination + ) return self._destination = stops[0] + # Fetch trip and route details once, unless updated if not self._departure: self._trip = None else: - trip_id = self._departure["trip_id"] - if not self._trip or self._trip.trip_id != trip_id: + trip_id = self._departure.get("trip_id") + if not 
self.extracting and (not self._trip or self._trip.trip_id != trip_id): _LOGGER.debug("Fetching trip details for %s", trip_id) self._trip = self._pygtfs.trips_by_id(trip_id)[0] - route_id = self._departure["route_id"] - if not self._route or self._route.route_id != route_id: + route_id = self._departure.get("route_id") + if not self.extracting and (not self._route or self._route.route_id != route_id): _LOGGER.debug("Fetching route details for %s", route_id) self._route = self._pygtfs.routes_by_id(route_id)[0] @@ -170,7 +176,7 @@ def _update_attrs(self): # noqa: C901 PLR0911 if not self._departure: self._next_departures = None else: - self._next_departures = self._departure["next_departures"] + self._next_departures = self._departure.get("next_departures",None) # Fetch agency details exactly once if self._agency is None and self._route: @@ -194,7 +200,7 @@ def _update_attrs(self): # noqa: C901 PLR0911 elif self._agency: _LOGGER.debug( "Self._departure time for state value TZ: %s ", - {self._departure["departure_time"]}, + {self._departure.get("departure_time")}, ) self._state = self._departure["departure_time"].replace( tzinfo=dt_util.get_time_zone(self._agency.agency_timezone) @@ -202,9 +208,9 @@ def _update_attrs(self): # noqa: C901 PLR0911 else: _LOGGER.debug( "Self._departure time from helper: %s", - {self._departure["departure_time"]}, + {self._departure.get("departure_time")}, ) - self._state = self._departure["departure_time"] + self._state = self._departure.get("departure_time") # settin state value self._attr_native_value = self._state @@ -221,7 +227,7 @@ def _update_attrs(self): # noqa: C901 PLR0911 name = ( f"{getattr(self._agency, 'agency_name', DEFAULT_NAME)} " - f"{self.origin} to {self.destination} next departure" + f"{self._origin} to {self._destination} next departure" ) if not self._departure: name = f"{DEFAULT_NAME}" @@ -230,7 +236,7 @@ def _update_attrs(self): # noqa: C901 PLR0911 # Add departure information if self._departure: 
self._attributes[ATTR_ARRIVAL] = dt_util.as_utc( - self._departure["arrival_time"] + self._departure.get("arrival_time") ).isoformat() self._attributes[ATTR_DAY] = self._departure["day"] @@ -259,9 +265,9 @@ def _update_attrs(self): # noqa: C901 PLR0911 if self._state is None: self._attributes[ATTR_INFO] = ( - "No more departures" + "No more departures or extracting new data" if self._include_tomorrow - else "No more departures today" + else "No more departures today or extracting new data" ) elif ATTR_INFO in self._attributes: del self._attributes[ATTR_INFO] @@ -338,10 +344,11 @@ def _update_attrs(self): # noqa: C901 PLR0911 ) else: self.remove_keys(prefix) - - _LOGGER.debug( - "Destination_stop_time %s", self._departure["destination_stop_time"] - ) + + if "destination_stop_time" in self._departure: + _LOGGER.debug("Destination_stop_time %s", self._departure["destination_stop_time"]) + else: + _LOGGER.warning("No destination_stop_time") prefix = "destination_stop" if self._departure: self.append_keys(self._departure["destination_stop_time"], prefix) @@ -363,26 +370,40 @@ def _update_attrs(self): # noqa: C901 PLR0911 prefix = "next_departures" self._attributes["next_departures"] = [] if self._next_departures: - self._attributes["next_departures"] = self._departure["next_departures"][ - :10 - ] + self._attributes["next_departures"] = self._departure[ + "next_departures"][:10] # Add next departures with their lines prefix = "next_departures_lines" self._attributes["next_departures_lines"] = [] if self._next_departures: self._attributes["next_departures_lines"] = self._departure[ - "next_departures_lines" - ][:10] + "next_departures_lines"][:10] # Add next departures with their headsign prefix = "next_departures_headsign" self._attributes["next_departures_headsign"] = [] if self._next_departures: self._attributes["next_departures_headsign"] = self._departure[ - "next_departures_headsign" - ][:10] - - self._attributes["updated_at"] = dt_util.now().replace(tzinfo=None) + 
"next_departures_headsign"][:10] + + self._attributes["gtfs_updated_at"] = self.coordinator.data[ + "gtfs_updated_at"] + + if self._departure_rt: + _LOGGER.debug("next dep realtime attr: %s", self._departure_rt) + # Add next departure realtime to the right level, only if populated + if "gtfs_rt_updated_at" in self._departure_rt: + self._attributes["gtfs_rt_updated_at"] = self._departure_rt["gtfs_rt_updated_at"] + self._attributes["next_departure_realtime"] = self._departure_rt["Due in"] + self._attributes["latitude"] = self._departure_rt["latitude"] + self._attributes["longitude"] = self._departure_rt["longitude"] + if ATTR_INFO_RT in self._attributes: + del self._attributes[ATTR_INFO_RT] + else: + _LOGGER.debug("No next departure realtime attributes") + self._attributes[ATTR_INFO_RT] = ( + "No realtime information" + ) self._attr_extra_state_attributes = self._attributes return self._attr_extra_state_attributes @@ -410,38 +431,3 @@ def remove_keys(self, prefix: str) -> None: self._attributes = { k: v for k, v in self._attributes.items() if not k.startswith(prefix) } - - -class GTFSRealtimeDepartureSensor(CoordinatorEntity): - """Implementation of a GTFS departure sensor.""" - - def __init__(self, coordinator: GTFSRealtimeUpdateCoordinator) -> None: - """Initialize the GTFSsensor.""" - super().__init__(coordinator) - self._name = coordinator.data["name"] + "_rt" - self._attributes: dict[str, Any] = {} - - self._attr_unique_id = f"gtfs-{self._name}_rt" - self._attr_device_info = DeviceInfo( - name=f"GTFS - {self._name}", - entry_type=DeviceEntryType.SERVICE, - identifiers={(DOMAIN, f"GTFS - {self._name}_rt")}, - manufacturer="GTFS", - model=self._name, - ) - _LOGGER.debug("GTFS RT Sensor: coordinator data: %s", coordinator.data ) - self._coordinator = coordinator - self._attributes = self._update_attrs_rt() - self._attr_extra_state_attributes = self._attributes - - @callback - def _handle_coordinator_update(self) -> None: - """Handle updated data from the 
coordinator.""" - self._update_attrs_rt() - super()._handle_coordinator_update() - - def _update_attrs_rt(self): # noqa: C901 PLR0911 - _LOGGER.debug(f"GTFS RT Sensor update attr DATA: {self._coordinator.data}") - self._attr_native_value = coordinator.data - self._attributes["next_departure_realtime"] = self._coordinator.data - return self._attributes \ No newline at end of file diff --git a/custom_components/gtfs2/services.yaml b/custom_components/gtfs2/services.yaml index 379675d..00d2629 100644 --- a/custom_components/gtfs2/services.yaml +++ b/custom_components/gtfs2/services.yaml @@ -1,19 +1,32 @@ # Describes the format for available ADS services update_gtfs: name: Update GTFS Data - description: Collects a new gtfs zip and unpacks it to sqlite + description: Unpacks source to gtfs-db fields: - name: + extract_from: + name: Indicate source of the data + description: Select if you update from url or zip + required: true + example: "url" + default: "url" + selector: + select: + translation_key: "extract_from" + options: + - "url" + - "zip" + file: name: Name of the transport service, without .zip description: If you use the same name as an existing one, the existing one will be overwitten required: true example: "mytransportservice" selector: - text: + text: url: name: URL description: provide the full path to the zip file itself required: true + default: "na" example: "https://path-to-my-zip-file-location/filename.zip" selector: - text: + text: diff --git a/custom_components/gtfs2/strings.json b/custom_components/gtfs2/strings.json index 2426849..d8110e1 100644 --- a/custom_components/gtfs2/strings.json +++ b/custom_components/gtfs2/strings.json @@ -50,7 +50,8 @@ "already_configured": "[%key:common::config_flow::abort::already_configured_device%]", "files_deleted": "Datasource deleted, this may affect existing routes", "stop_incorrect": "Start and/or End destination incorrect, possibly not in same direction, check logs", - "no_zip_file": "Data collection issue: 
ZIP file not in the correct folder" + "no_zip_file": "Data collection issue: ZIP file not in the correct folder", + "extracting": "Extracting data, this can take a while" } }, "options": { @@ -65,7 +66,9 @@ "description": "Provide url to real time API", "data": { "trip_update_url": "URL to trip data", - "vehicle_position_url": "URL to vehicle position (can be the same as trip data)" + "vehicle_position_url": "URL to vehicle position (can be the same as trip data)", + "api_key": "API key, if required", + "x_api_key": "X_API key, if required" } } } diff --git a/custom_components/gtfs2/translations/en.json b/custom_components/gtfs2/translations/en.json index 66d1da6..66a9e03 100644 --- a/custom_components/gtfs2/translations/en.json +++ b/custom_components/gtfs2/translations/en.json @@ -10,7 +10,8 @@ "source": { "data": { "file": "New datasource name", - "url": "external url to gtfs data (zip) file" + "url": "external url to gtfs data (zip) file", + "extract_from": "Extract data from:" }, "description": "NOTE: with a new url/zip, this may take quite a bit of time, \n depending on file-size and system performance" }, @@ -25,7 +26,6 @@ "origin": "Origin Stop", "destination": "Destination Stop", "name": "Name of the route", - "offset": "Offset in minutes", "refresh_interval": "Refresh interval in minutes", "include_tomorrow": "Include tomorrow" } @@ -45,7 +45,8 @@ "files_deleted": "Datasource deleted, this may affect existing routes", "stop_incorrect": "Start and/or End destination incorrect, possibly no transport 'today' or not in same direction, check logs", "no_data_file": "Data collection issue: URL incorrect or filename not in the correct folder", - "no_zip_file": "Data collection issue: ZIP file not existing in the correct folder, note that it is capital-sensitive" + "no_zip_file": "Data collection issue: ZIP file does not exist in the correct folder, note that the name is case-sensitive", + "extracting": "Extracting data, this can take a while" } }, "options": { @@
-53,16 +54,54 @@ "init": { "description": "Customize the way the integration works", "data": { - "refresh_interval": "Data refresh interval (in minutes)" + "refresh_interval": "Data refresh interval (in minutes)", + "offset": "Offset in minutes", + "real_time": "Setup Realtime integration? \n (needs data from the same source)" } }, "real_time": { "description": "Provide url to real time API", "data": { "trip_update_url": "URL to trip data", - "vehicle_position_url": "URL to vehicle position (can be the same as trip data)" + "vehicle_position_url": "URL to vehicle position (or same as trip data)", + "api_key": "API key, if required", + "x_api_key": "X_API key, if required" } } } + }, + "selector": { + "extract_from": { + "options": { + "zip": "ZIP: expects a file in gtfs2-folder with below name, without extension .zip", + "url": "URL: uses your URL below, leave 'na' if using zip" + } + }, + "direction": { + "options": { + "0": "Outbound", + "1": "Return" + } + } + }, + "services": { + "update_gtfs": { + "name": "Updates a GTFS2 datasource", + "description": "Either via Link or placing a Zip yourselves in gtfs2", + "fields": { + "extract_from": { + "name": "Indicate source to use zip or url", + "description": "" + }, + "file": { + "name": "Name of the transport service, without .zip", + "description": "If you use the same name as an existing one, the existing one will be overwitten" + }, + "url": { + "name": "URL", + "description": "provide the full path to the zip file itself" + } + } + } } } diff --git a/custom_components/gtfs2/translations/fr.json b/custom_components/gtfs2/translations/fr.json index 72f02bc..5625565 100644 --- a/custom_components/gtfs2/translations/fr.json +++ b/custom_components/gtfs2/translations/fr.json @@ -10,9 +10,10 @@ "source": { "data": { "file": "Nom de la nouvelle source de données", - "url": "URL externe vers le fichier (zip) des données GTFS" + "url": "URL externe vers le fichier (zip) des données GTFS", + "extract_from": "Collecte 
données de:" }, - "description": "REMARQUE: avec une nouvelle URL/zip, cela peut prendre du temps après la soumission, \n selon la taille du fichier et performance du serveur" + "description": "REMARQUE: avec une nouvelle URL/zip, cela peut prendre du temps après la soumission, selon la taille du fichier et performance du serveur" }, "route": { "data": { @@ -25,7 +26,6 @@ "origin": "Arrêt d'origine", "destination": "Arrêt de destination", "name": "Nom de la ligne", - "offset": "Décalage en minutes", "refresh_interval": "Intervalle d'actualisation en minutes", "include_tomorrow": "Inclure le lendemain?" } @@ -45,7 +45,8 @@ "files_deleted": "Source de données supprimée, cela peut affecter les itinéraires existants", "stop_incorrect": "Arrêt de départ et/ou de fin incorrecte, éventuellement pas de transport « aujourd'hui » ou pas dans la même direction, vérifiez logs d'érreur", "no_data_file": "Problème de collecte de données : URL incorrecte ou nom de fichier ne se trouve pas dans le bon dossier", - "no_zip_file": "Problème de collecte de données : fichier ZIP ne se trouve pas dans le bon dossier, note: sensible à la casse" + "no_zip_file": "Problème de collecte de données : fichier ZIP ne se trouve pas dans le bon dossier, note: sensible à la casse", + "extracting": "Extraction des données, cela prend du temps" } }, "options": { @@ -53,9 +54,54 @@ "init": { "description": "Personnalisez le fonctionnement de l'intégration", "data": { - "refresh_interval": "Personnalisez le fonctionnement de l'intégration" + "refresh_interval": "Intervalle d'actualisation en minutes", + "offset": "Décalage en minutes", + "real_time": "Ajoute intégration temps réel? 
\n (nécessite données de la même source)" } + }, + "real_time": { + "description": "URL vers données temps réel", + "data": { + "trip_update_url": "URL vers: trip data", + "vehicle_position_url": "URL vers: position véhicule (ou trip data)", + "api_key": "API key, si nécessaire", + "x_api_key": "X_API key, si nécessaire" + } + } + } + }, + "selector": { + "extract_from": { + "options": { + "zip": "ZIP: attend un fichier dans dossier 'gtfs2' avec le même nom, sans extension .zip", + "url": "URL: utilise l'URL, laisse 'na' si zip" + } + }, + "direction": { + "options": { + "0": "Aller", + "1": "Retour" } } + }, + "services": { + "update_gtfs": { + "name": "MAJ d'un GTFS2 datasource", + "description": "Utiliser un lien ou placer votre fichier ZIP dans le dossier gtfs2", + "fields": { + "extract_from": { + "name": "Collecte données de:", + "description": "" + }, + "file": { + "name": "Nom du Service, sans ajouter .zip", + "description": "A noter: si déjà existant, l'ancien sera remplacé" + }, + "url": { + "name": "URL externe vers le fichier (zip) des données GTFS, laissez le 'na' si zip", + "description": "A noter: si déjà existant, l'ancien sera remplacé" + } + } + } } } \ No newline at end of file diff --git a/example.md b/example.md index 523df8e..dd266c6 100644 --- a/example.md +++ b/example.md @@ -42,116 +42,113 @@ You can add a optional area ![image](https://github.com/vingerha/gtfs2/assets/44190435/f2f855f9-bc07-405d-8b0b-09b3da7e4f79) +## CONFIGURE Options + +After setup you can change the refresh interval and add real-time source(s) + +![image](https://github.com/vingerha/gtfs2/assets/44190435/03135ba3-e9ff-4fe6-a23b-bb1f0a44c6ea) + +![image](https://github.com/vingerha/gtfs2/assets/44190435/11de0f3c-ac1b-4b4d-8712-38764dfc5bd4) + +![image](https://github.com/vingerha/gtfs2/assets/44190435/5895e947-882d-444e-9259-e56d7d5e426a) + + + + + Sample of the entity and its attributes ``` -arrival: "2023-11-04T09:42:29+00:00" +arrival: 2023-11-18T12:18:00+00:00 day: 
today first: false last: false offset: 0 -agency_agency_id: LE MET -agency_agency_name: LE MET' -agency_agency_url: https://lemet.fr +agency_agency_id: None +agency_agency_name: TAO (Orléans) +agency_agency_url: http://reseau-tao.fr/ agency_agency_timezone: Europe/Paris -agency_agency_lang: FR -agency_agency_phone: 0.800.00.29.38 -agency_agency_fare_url: https://services.lemet.fr/fr/billetterie -agency_agency_email: contact@lemet.fr -origin_station_stop_id: "6010" +agency_agency_lang: fr +agency_agency_phone: 0800012000 +agency_agency_fare_url: None +agency_agency_email: None +origin_station_stop_id: ORLEANS:StopArea:00026500 origin_station_stop_code: None -origin_station_stop_name: P+R WOIPPY +origin_station_stop_name: Gaston Galloux origin_station_stop_desc: None -origin_station_stop_lat: "49.150349" -origin_station_stop_lon: "6.173323" +origin_station_stop_lat: 47.884827 +origin_station_stop_lon: 1.924645 origin_station_zone_id: None -origin_station_stop_url: https://services.lemet.fr/fr/biv/arret/1627 -origin_station_location_type: "0" +origin_station_stop_url: None +origin_station_location_type: 0 origin_station_parent_station: None -origin_station_stop_timezone: None -origin_station_wheelchair_boarding: "1" +origin_station_stop_timezone: Europe/Paris +origin_station_wheelchair_boarding: 0 origin_station_platform_code: None origin_station_location_type_name: Station -origin_station_wheelchair_boarding_available: true -destination_station_stop_id: "6180" +origin_station_wheelchair_boarding_available: unknown +destination_station_stop_id: ORLEANS:StopArea:01001712 destination_station_stop_code: None -destination_station_stop_name: FELIX ALCAN +destination_station_stop_name: Gare d'Orléans - Quai E destination_station_stop_desc: None -destination_station_stop_lat: "49.112572" -destination_station_stop_lon: "6.199158" +destination_station_stop_lat: 47.907085 +destination_station_stop_lon: 1.90578 destination_station_zone_id: None -destination_station_stop_url: 
https://services.lemet.fr/fr/biv/arret/7324 -destination_station_location_type: "0" +destination_station_stop_url: None +destination_station_location_type: 0 destination_station_parent_station: None -destination_station_stop_timezone: None -destination_station_wheelchair_boarding: "1" +destination_station_stop_timezone: Europe/Paris +destination_station_wheelchair_boarding: 0 destination_station_platform_code: None destination_station_location_type_name: Station -destination_station_wheelchair_boarding_available: true -route_route_id: A-98 -route_agency_id: LE MET -route_route_short_name: MA -route_route_long_name: METTIS A +destination_station_wheelchair_boarding_available: unknown +route_route_id: ORLEANS:Line:40 +route_agency_id: None +route_route_short_name: 40 +route_route_long_name: GARE ORLEANS - PETITE MERIE route_route_desc: None -route_route_type: "3" +route_route_type: 3 route_route_url: None -route_route_color: F0980C -route_route_text_color: FFFFFF +route_route_color: 24A472 +route_route_text_color: 000000 route_type_name: Bus -trip_route_id: A-98 -trip_service_id: HIV2324-Sam_Sp23-Samedi-21 -trip_trip_id: 1281546-HIV2324-Sam_Sp23-Samedi-21 -trip_trip_headsign: MA - BORNY +trip_route_id: ORLEANS:Line:40 +trip_service_id: chouette:TimeTable:4f12e6e5-93ca-4af2-b493-0858f5c73e39:LOC +trip_trip_id: ORLEANS:VehicleJourney:40_A_56_16_4002_6_124300 +trip_trip_headsign: None trip_trip_short_name: None -trip_direction_id: "0" -trip_block_id: "196205" -trip_shape_id: A0014 -trip_wheelchair_accessible: "1" -trip_bikes_allowed: "2" -trip_bikes_allowed_state: false -trip_wheelchair_access_available: true -origin_stop_arrival_time: "2023-11-04 10:16:00" -origin_stop_departure_time: "2023-11-04 10:16:00" -origin_stop_drop_off_type: 0 +trip_direction_id: 0 +trip_block_id: None +trip_shape_id: PME-CNY-POSC-GARE +trip_wheelchair_accessible: None +trip_bikes_allowed: None +trip_bikes_allowed_state: unknown +trip_wheelchair_access_available: unknown 
+origin_stop_arrival_time: 2023-11-18T12:09:05+00:00
+origin_stop_departure_time: 2023-11-18T12:09:05+00:00
 origin_stop_pickup_type: 0
-origin_stop_sequence: 1
-origin_stop_drop_off_type_state: Regular
+origin_stop_sequence: 17
+origin_stop_drop_off_type_state: unknown
 origin_stop_pickup_type_state: Regular
 origin_stop_timepoint_exact: true
-destination_stop_arrival_time: "2023-11-04 10:42:29"
-destination_stop_departure_time: "2023-11-04 10:42:29"
-destination_stop_drop_off_type: 0
+destination_stop_arrival_time: 2023-11-18T12:18:00+00:00
+destination_stop_departure_time: 2023-11-18T12:18:00+00:00
 destination_stop_pickup_type: 0
-destination_stop_sequence: 19
-destination_stop_drop_off_type_state: Regular
+destination_stop_sequence: 23
+destination_stop_drop_off_type_state: unknown
 destination_stop_pickup_type_state: Regular
 destination_stop_timepoint_exact: true
-next_departures:
-  - "2023-11-04 10:16:00"
-  - "2023-11-04 10:31:00"
-  - "2023-11-04 10:46:00"
-  - "2023-11-04 11:01:00"
-  - "2023-11-04 11:16:00"
-  - "2023-11-04 11:31:00"
-  - "2023-11-04 11:46:00"
-  - "2023-11-04 12:01:00"
-  - "2023-11-04 12:16:00"
-  - "2023-11-04 12:31:00"
-next_departures_lines:
-  - 2023-11-04 10:16:00 (METTIS A)
-  - 2023-11-04 10:31:00 (METTIS A)
-  - 2023-11-04 10:46:00 (METTIS A)
-  - 2023-11-04 11:01:00 (METTIS A)
-  - 2023-11-04 11:16:00 (METTIS A)
-  - 2023-11-04 11:31:00 (METTIS A)
-  - 2023-11-04 11:46:00 (METTIS A)
-  - 2023-11-04 12:01:00 (METTIS A)
-  - 2023-11-04 12:16:00 (METTIS A)
-  - 2023-11-04 12:31:00 (METTIS A)
-updated_at: "2023-11-04T10:07:07.085514"
-attribution: LE MET'
+next_departures: 2023-11-18T12:09:05+00:00, 2023-11-18T12:39:05+00:00, 2023-11-18T13:10:05+00:00, 2023-11-18T13:40:05+00:00, 2023-11-18T14:10:05+00:00, 2023-11-18T14:40:05+00:00, 2023-11-18T15:11:05+00:00, 2023-11-18T15:41:05+00:00, 2023-11-18T16:12:05+00:00, 2023-11-18T16:42:05+00:00
+next_departures_lines: 2023-11-18T12:09:05+00:00 (GARE ORLEANS - PETITE MERIE), 2023-11-18T12:39:05+00:00 (GARE ORLEANS - PETITE MERIE), 2023-11-18T13:10:05+00:00 (GARE ORLEANS - PETITE MERIE), 2023-11-18T13:40:05+00:00 (GARE ORLEANS - PETITE MERIE), 2023-11-18T14:10:05+00:00 (GARE ORLEANS - PETITE MERIE), 2023-11-18T14:40:05+00:00 (GARE ORLEANS - PETITE MERIE), 2023-11-18T15:11:05+00:00 (GARE ORLEANS - PETITE MERIE), 2023-11-18T15:41:05+00:00 (GARE ORLEANS - PETITE MERIE), 2023-11-18T16:12:05+00:00 (GARE ORLEANS - PETITE MERIE), 2023-11-18T16:42:05+00:00 (GARE ORLEANS - PETITE MERIE)
+next_departures_headsign: 2023-11-18T12:09:05+00:00 (None), 2023-11-18T12:39:05+00:00 (None), 2023-11-18T13:10:05+00:00 (None), 2023-11-18T13:40:05+00:00 (None), 2023-11-18T14:10:05+00:00 (None), 2023-11-18T14:40:05+00:00 (None), 2023-11-18T15:11:05+00:00 (None), 2023-11-18T15:41:05+00:00 (None), 2023-11-18T16:12:05+00:00 (None), 2023-11-18T16:42:05+00:00 (None)
+gtfs_updated_at: 2023-11-18T11:38:52.654949+00:00
+gtfs_rt_updated_at: 2023-11-18T11:40:59.832457+00:00
+next_departure_realtime: 2023-11-18T12:09:30+00:00
+latitude:
+longitude:
+attribution: TAO (Orléans)
 device_class: timestamp
 icon: mdi:bus
-friendly_name: MyRouteInMetz
+friendly_name: Orleans 40 outbound
 ```
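For reviewers: the services.yaml change above extends `update_gtfs` with an `extract_from` field next to `file` and `url`. A minimal sketch of calling it from a Home Assistant automation, using the field names and example URL from the service definition; the `gtfs2.update_gtfs` service name is inferred from the integration domain, and the weekly schedule and datasource name are illustrative assumptions:

```yaml
# Sketch: refresh a GTFS datasource every Monday at 03:00 via the
# update_gtfs service. Field names match services.yaml in this change;
# the schedule, alias, and datasource name are examples only.
automation:
  - alias: "Weekly GTFS2 datasource update"
    trigger:
      - platform: time
        at: "03:00:00"
    condition:
      - condition: time
        weekday:
          - mon
    action:
      - service: gtfs2.update_gtfs
        data:
          extract_from: "url"
          file: "mytransportservice"
          url: "https://path-to-my-zip-file-location/filename.zip"
```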