From e1c25a9e4199ca8897b0ec83bfe11d9b60e27f59 Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Wed, 8 Nov 2023 10:47:15 +0100 Subject: [PATCH 01/18] Update issue templates --- .github/ISSUE_TEMPLATE/bug_report.md | 23 +++++++++++++++++++++++ 1 file changed, 23 insertions(+) create mode 100644 .github/ISSUE_TEMPLATE/bug_report.md diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md new file mode 100644 index 0000000..09bd002 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/bug_report.md @@ -0,0 +1,23 @@ +--- +name: Bug report +about: Create a report to help us improve +title: '' +labels: '' +assignees: '' + +--- + +**Describe the bug** +A clear and concise description of what the bug is. + +Steps/data to reproduce the behavior, e.g. +- url to the zip file +- route ID +- stop ID +- outward/return + +**Release used** +Which gtfs2 release and HA type (HAOS/Container) + +**Additional** +Please add logs if helpful From cc1a262119e9f5a471f015723ebc582b9885890a Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Wed, 8 Nov 2023 10:48:38 +0100 Subject: [PATCH 02/18] Update issue templates --- .github/ISSUE_TEMPLATE/feature_request.md | 17 +++++++++++++++++ 1 file changed, 17 insertions(+) create mode 100644 .github/ISSUE_TEMPLATE/feature_request.md diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md new file mode 100644 index 0000000..f26a893 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/feature_request.md @@ -0,0 +1,17 @@ +--- +name: Feature request +about: Suggest an idea for this project +title: "[FEATURE]: " +labels: '' +assignees: '' + +--- + +**Describe the solution you'd like** +A clear and concise description of what you want to happen. + +**Describe alternatives you've considered** +A clear and concise description of any alternative solutions or features you've considered. + +**Additional context** +Add any other context or screenshots about the feature request here.
From 2a292ed524f8410ff4c45420f2fe81d8fd3b488f Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Fri, 10 Nov 2023 09:50:36 +0100 Subject: [PATCH 03/18] Update example.md --- example.md | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/example.md b/example.md index c31fd08..8f799ec 100644 --- a/example.md +++ b/example.md @@ -13,9 +13,10 @@ In this case that is: https://data.lemet.fr/documents/LEMET-gtfs.zip ![image](https://github.com/vingerha/gtfs2/assets/44190435/3688925f-63cd-451a-9db1-313a028c2188) -NOTE: this will download and unpack the zip-file to a sqlite database, which can take (many) minutes, **please be patient** +NOTE: this will download and unpack the zip-file to a sqlite database, which can take time (examples from 10mins to 2hrs), **please be patient** + +![image](https://github.com/vingerha/gtfs2/assets/44190435/dd26f517-1cd9-4386-b4ea-c605d02a0ac7) -![image](https://github.com/vingerha/gtfs2/assets/44190435/02ab24ed-c10d-43e5-8c3e-f221044a1a9e) ## Select the route From bc905d32521feb62a23ede9d6e3e73606f31bbb5 Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Fri, 10 Nov 2023 09:54:59 +0100 Subject: [PATCH 04/18] Update README.md --- README.md | 6 +++++- 1 file changed, 5 insertions(+), 1 deletion(-) diff --git a/README.md b/README.md index 83b098a..028a266 100644 --- a/README.md +++ b/README.md @@ -17,7 +17,11 @@ Core GTFS uses start + stop, it then determines every option between them and pr ***Solution/workaround in GTFS2***: attribute added: next_departure_line shows all next departues with their line/means-of-transport. So even if you select a route first and then two stops, the attibutes will still show alternatives between those 2 stops, if applicable. ## Updates -- 20231104: initial version +20231110: adding features: +- timezone check is now in order: agency (deliverung data), if not > HA system, if not > UTC +- attribute next_departure_headsigns +- adding route shortname in selection/list +20231104: initial version ## ToDo's - Issue when updating the source db, it throws a db locked error. 
This when an existing entity for the same db starts polling it at the same time From a4851c7b3006f6cd5219401f15c19ce64ca94c6a Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Fri, 10 Nov 2023 09:56:55 +0100 Subject: [PATCH 05/18] Update example.md --- example.md | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/example.md b/example.md index 8f799ec..523df8e 100644 --- a/example.md +++ b/example.md @@ -11,7 +11,9 @@ In this case that is: https://data.lemet.fr/documents/LEMET-gtfs.zip ![image](https://github.com/vingerha/gtfs2/assets/44190435/7dd77425-07f8-45d0-8d0c-d9948fca6fbb) -![image](https://github.com/vingerha/gtfs2/assets/44190435/3688925f-63cd-451a-9db1-313a028c2188) +### Two options, either extract from a file you placed in the gtfs2 folder Or use a url + +![image](https://github.com/vingerha/gtfs2/assets/44190435/e64cb7d9-7b68-4169-9cc4-e216a303f7d3) NOTE: this will download and unpack the zip-file to a sqlite database, which can take time (examples from 10mins to 2hrs), **please be patient** From be5ff2f6e0a15d812bdae48be91d2e25a2537e44 Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Fri, 10 Nov 2023 09:59:00 +0100 Subject: [PATCH 06/18] Update README.md --- README.md | 1 + 1 file changed, 1 insertion(+) diff --git a/README.md b/README.md index 028a266..eb13f0b 100644 --- a/README.md +++ b/README.md @@ -21,6 +21,7 @@ Core GTFS uses start + stop, it then determines every option between them and pr - timezone check is now in order: agency (deliverung data), if not > HA system, if not > UTC - attribute next_departure_headsigns - adding route shortname in selection/list +- for new datasource, allow to use a self-placed zip file in the gtfs2 folder. This for zip that are not available via URL or zip with data that may need modification to comply with extraction conditions by pygtfs 20231104: initial version ## ToDo's From 33e9ce28e0a5174168756e9f3270f4c3851991a4 Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Fri, 10 Nov 2023 09:59:24 +0100 Subject: [PATCH 07/18] Update README.md --- README.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/README.md b/README.md index eb13f0b..b732874 100644 --- a/README.md +++ b/README.md @@ -21,7 +21,8 @@ Core GTFS uses start + stop, it then determines every option between them and pr - timezone check is now in order: agency (deliverung data), if not > HA system, if not > UTC - attribute next_departure_headsigns - adding route shortname in selection/list -- for new datasource, allow to use a self-placed zip file in the gtfs2 folder. This for zip that are not available via URL or zip with data that may need modification to comply with extraction conditions by pygtfs +- for new datasource, allow to use a self-placed zip file in the gtfs2 folder. This for zip that are not available via URL or zip with data that may need modification to comply with extraction conditions by pygtfs + 20231104: initial version ## ToDo's From f975c5e6d57f035e87b5e1c3cae32c32a64b9219 Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Fri, 10 Nov 2023 10:00:57 +0100 Subject: [PATCH 08/18] New Features timezone check is now in order: agency (deliverung data), if not > HA system, if not > UTC attribute next_departure_headsigns adding route shortname in selection/list for new datasource, allow to use a self-placed zip file in the gtfs2 folder. 
This for zip that are not available via URL or zip with data that may need modification to comply with extraction conditions by pygtfs --- custom_components/gtfs2/__init__.py | 37 ++- custom_components/gtfs2/config_flow.py | 89 +++++- custom_components/gtfs2/const.py | 6 +- custom_components/gtfs2/coordinator.py | 71 ++++- custom_components/gtfs2/gtfs_helper.py | 74 +++-- custom_components/gtfs2/gtfs_rt_helper.py | 308 +++++++++++++++++++ custom_components/gtfs2/manifest.json | 2 +- custom_components/gtfs2/sensor.py | 57 +++- custom_components/gtfs2/strings.json | 25 +- custom_components/gtfs2/translations/en.json | 26 +- custom_components/gtfs2/translations/fr.json | 19 +- 11 files changed, 645 insertions(+), 69 deletions(-) create mode 100644 custom_components/gtfs2/gtfs_rt_helper.py diff --git a/custom_components/gtfs2/__init__.py b/custom_components/gtfs2/__init__.py index ae5c8de..fa688c5 100644 --- a/custom_components/gtfs2/__init__.py +++ b/custom_components/gtfs2/__init__.py @@ -5,13 +5,32 @@ from homeassistant.config_entries import ConfigEntry from homeassistant.core import HomeAssistant, ServiceCall -from .const import DOMAIN, PLATFORMS, DEFAULT_PATH -from .coordinator import GTFSUpdateCoordinator +from datetime import timedelta + +from .const import DOMAIN, PLATFORMS, DEFAULT_PATH, DEFAULT_REFRESH_INTERVAL +from .coordinator import GTFSUpdateCoordinator, GTFSRealtimeUpdateCoordinator import voluptuous as vol from .gtfs_helper import get_gtfs _LOGGER = logging.getLogger(__name__) +async def async_migrate_entry(hass, config_entry: ConfigEntry) -> bool: + """Migrate old entry.""" + _LOGGER.debug("Migrating from version %s", config_entry.version) + + if config_entry.version == 1: + + new = {**config_entry.data} + new['extract_from'] = 'url' + new.pop('refresh_interval') + + config_entry.version = 2 + hass.config_entries.async_update_entry(config_entry, data=new) + + _LOGGER.debug("Migration to version %s successful", config_entry.version) + + return True + async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool: """Set up GTFS from a config entry.""" @@ -19,12 +38,14 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool: coordinator = GTFSUpdateCoordinator(hass, entry) - await coordinator.async_config_entry_first_refresh() - + #await coordinator.async_config_entry_first_refresh() + hass.data[DOMAIN][entry.entry_id] = { "coordinator": coordinator, } - + + entry.async_on_unload(entry.add_update_listener(update_listener)) + await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS) return True @@ -50,3 +71,9 @@ def update_gtfs(call): hass.services.register( DOMAIN, "update_gtfs", update_gtfs) return True + +async def update_listener(hass: HomeAssistant, entry: ConfigEntry): + """Handle options update.""" + hass.data[DOMAIN][entry.entry_id]['coordinator'].update_interval = timedelta(minutes=entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)) + + return True \ No newline at end of file diff --git a/custom_components/gtfs2/config_flow.py b/custom_components/gtfs2/config_flow.py index 54d9018..aeef210 100644 --- a/custom_components/gtfs2/config_flow.py +++ b/custom_components/gtfs2/config_flow.py @@ -8,8 +8,9 @@ from homeassistant import config_entries from homeassistant.data_entry_flow import FlowResult import homeassistant.helpers.config_validation as cv +from homeassistant.core import HomeAssistant, callback -from .const import DEFAULT_PATH, DOMAIN +from .const import DEFAULT_PATH, DOMAIN, DEFAULT_REFRESH_INTERVAL 
from .gtfs_helper import ( get_gtfs, @@ -33,7 +34,7 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN): """Handle a config flow for GTFS.""" - VERSION = 1 + VERSION = 2 def __init__(self) -> None: """Init ConfigFlow.""" @@ -67,10 +68,11 @@ async def async_step_user(self, user_input: dict | None = None) -> FlowResult: return await self.async_step_remove() else: user_input["url"] = "na" + user_input["extract_from"] = "zip" self._user_inputs.update(user_input) _LOGGER.debug(f"UserInputs File: {self._user_inputs}") return await self.async_step_route() - + async def async_step_source(self, user_input: dict | None = None) -> FlowResult: """Handle a flow initialized by the user.""" errors: dict[str, str] = {} @@ -78,6 +80,7 @@ async def async_step_source(self, user_input: dict | None = None) -> FlowResult: check_data = await self._check_data(user_input) if check_data: errors["base"] = check_data + return self.async_abort(reason=check_data) else: self._user_inputs.update(user_input) _LOGGER.debug(f"UserInputs Data: {self._user_inputs}") @@ -88,11 +91,12 @@ async def async_step_source(self, user_input: dict | None = None) -> FlowResult: data_schema=vol.Schema( { vol.Required("file"): str, - vol.Required("url"): str, + vol.Required("extract_from"): vol.In({"zip": "Use gtfs2/zipfile with above name, without extension", "url": "Use URL below, leave 'na' if using zip"}), + vol.Required("url", default="na"): str, }, ), errors=errors, - ) + ) async def async_step_remove(self, user_input: dict | None = None) -> FlowResult: """Handle a flow initialized by the user.""" @@ -121,8 +125,7 @@ async def async_step_route(self, user_input: dict | None = None) -> FlowResult: self._pygtfs = get_gtfs( self.hass, DEFAULT_PATH, - self._user_inputs["file"], - self._user_inputs["url"], + self._user_inputs, False, ) errors: dict[str, str] = {} @@ -163,7 +166,6 @@ async def async_step_stops(self, user_input: dict | None = None) -> FlowResult: vol.Required("destination", default=last_stop): vol.In(stops), vol.Required("name"): str, vol.Optional("offset", default=0): int, - vol.Required("refresh_interval", default=15): int, vol.Required("include_tomorrow"): vol.In( {False: "No", True: "Yes"} ), @@ -185,15 +187,15 @@ async def async_step_stops(self, user_input: dict | None = None) -> FlowResult: async def _check_data(self, data): self._pygtfs = await self.hass.async_add_executor_job( - get_gtfs, self.hass, DEFAULT_PATH, data["file"], data["url"], False + get_gtfs, self.hass, DEFAULT_PATH, data, False ) - if self._pygtfs == "no_data_file": - return "no_data_file" + if self._pygtfs == "no_data_file" or "no_zip_file": + return self._pygtfs return None async def _check_config(self, data): self._pygtfs = await self.hass.async_add_executor_job( - get_gtfs, self.hass, DEFAULT_PATH, data["file"], data["url"], False + get_gtfs, self.hass, DEFAULT_PATH, data, False ) if self._pygtfs == "no_data_file": return "no_data_file" @@ -210,7 +212,7 @@ async def _check_config(self, data): try: self._data["next_departure"] = await self.hass.async_add_executor_job( - get_next_departure, self._data + get_next_departure, self ) except Exception as ex: # pylint: disable=broad-except _LOGGER.info( @@ -222,3 +224,64 @@ async def _check_config(self, data): if self._data["next_departure"]: return None return "stop_incorrect" + + @staticmethod + @callback + def async_get_options_flow( + config_entry: config_entries.ConfigEntry, + ) -> config_entries.OptionsFlow: + """Create the options flow.""" + return GTFSOptionsFlowHandler(config_entry) + + 
+class GTFSOptionsFlowHandler(config_entries.OptionsFlow): + def __init__(self, config_entry: config_entries.ConfigEntry) -> None: + """Initialize options flow.""" + self.config_entry = config_entry + self._data: dict[str, str] = {} + self._user_inputs: dict = {} + + async def async_step_init( + self, user_input: dict[str, Any] | None = None + ) -> FlowResult: + """Manage the options.""" + if user_input is not None: + user_input['real_time'] = False + if user_input['real_time']: + self._user_inputs.update(user_input) + _LOGGER.debug(f"GTFS Options with realtime: {self._user_inputs}") + return await self.async_step_real_time() + else: + self._user_inputs.update(user_input) + _LOGGER.debug(f"GTFS Options without realtime: {self._user_inputs}") + return self.async_create_entry(title="", data=self._user_inputs) + + return self.async_show_form( + step_id="init", + data_schema=vol.Schema( + { + vol.Optional("refresh_interval", default=self.config_entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)): int, +# vol.Required("real_time"): vol.In({False: "No", True: "Yes"}), + } + ), + ) + + async def async_step_real_time( + self, user_input: dict[str, Any] | None = None + ) -> FlowResult: + """Handle a realtime initialized by the user.""" + errors: dict[str, str] = {} + if user_input is not None: + self._user_inputs.update(user_input) + return self.async_create_entry(title="", data=self._user_inputs) + + return self.async_show_form( + step_id="real_time", + data_schema=vol.Schema( + { + vol.Required("trip_update_url"): str, + vol.Required("vehicle_position_url"): str, + }, + ), + errors=errors, + ) \ No newline at end of file diff --git a/custom_components/gtfs2/const.py b/custom_components/gtfs2/const.py index 2b6c88b..865b814 100644 --- a/custom_components/gtfs2/const.py +++ b/custom_components/gtfs2/const.py @@ -37,11 +37,6 @@ ATTR_WHEELCHAIR_DESTINATION = "destination_station_wheelchair_boarding_available" ATTR_WHEELCHAIR_ORIGIN = "origin_station_wheelchair_boarding_available" -CONF_DATA = "data" -CONF_DESTINATION = "destination" -CONF_ORIGIN = "origin" -CONF_TOMORROW = "include_tomorrow" - BICYCLE_ALLOWED_DEFAULT = STATE_UNKNOWN BICYCLE_ALLOWED_OPTIONS = {1: True, 2: False} DROP_OFF_TYPE_DEFAULT = STATE_UNKNOWN @@ -240,3 +235,4 @@ WHEELCHAIR_ACCESS_OPTIONS = {1: True, 2: False} WHEELCHAIR_BOARDING_DEFAULT = STATE_UNKNOWN WHEELCHAIR_BOARDING_OPTIONS = {1: True, 2: False} + diff --git a/custom_components/gtfs2/coordinator.py b/custom_components/gtfs2/coordinator.py index 54e4b35..c04f29a 100644 --- a/custom_components/gtfs2/coordinator.py +++ b/custom_components/gtfs2/coordinator.py @@ -10,12 +10,13 @@ from .const import DEFAULT_PATH, DEFAULT_REFRESH_INTERVAL from .gtfs_helper import get_gtfs, get_next_departure, check_datasource_index +from .gtfs_rt_helper import get_rt_route_statuses, get_next_services _LOGGER = logging.getLogger(__name__) class GTFSUpdateCoordinator(DataUpdateCoordinator): - """Data update coordinator for the Pronote integration.""" + """Data update coordinator for the GTFS integration.""" config_entry: ConfigEntry @@ -25,20 +26,20 @@ def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: hass=hass, logger=_LOGGER, name=entry.entry_id, - update_interval=timedelta( - minutes=entry.data.get("refresh_interval", DEFAULT_REFRESH_INTERVAL) - ), + update_interval=timedelta(minutes=entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)), ) self.config_entry = entry self.hass = hass + self._pygtfs = "" self._data: dict[str, str] = {} async def 
_async_update_data(self) -> dict[str, str]: """Update.""" data = self.config_entry.data + options = self.config_entry.options self._pygtfs = get_gtfs( - self.hass, DEFAULT_PATH, data["file"], data["url"], False + self.hass, DEFAULT_PATH, data, False ) self._data = { "schedule": self._pygtfs, @@ -56,9 +57,65 @@ async def _async_update_data(self) -> dict[str, str]: try: self._data["next_departure"] = await self.hass.async_add_executor_job( - get_next_departure, self._data + get_next_departure, self ) except Exception as ex: # pylint: disable=broad-except _LOGGER.error("Error getting gtfs data from generic helper: %s", ex) + _LOGGER.debug("GTFS coordinator data from helper: %s", self._data["next_departure"]) + return self._data + +class GTFSRealtimeUpdateCoordinator(DataUpdateCoordinator): + """Data update coordinator for the GTFSRT integration.""" + + config_entry: ConfigEntry + + + def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: + """Initialize the coordinator.""" + _LOGGER.debug("GTFS RT: coordinator init") + super().__init__( + hass=hass, + logger=_LOGGER, + name=entry.entry_id, + update_interval=timedelta(minutes=entry.options.get("refresh_interval_rt", DEFAULT_REFRESH_INTERVAL_RT)), + ) + self.config_entry = entry + self.hass = hass + self._data: dict[str, str] = {} + + async def _async_update_data(self) -> dict[str, str]: + """Update.""" + data = self.config_entry.data + options = self.config_entry.options + _LOGGER.debug("GTFS RT: coordinator async_update_data: %s", data) + _LOGGER.debug("GTFS RT: coordinator async_update_data options: %s", options) + #add real_time if setup + + + if "real_time" in options: + + """Initialize the info object.""" + self._trip_update_url = options["trip_update_url"] + self._vehicle_position_url = options["vehicle_position_url"] + self._route_delimiter = "-" +# if options["CONF_API_KEY"] is not None: +# self._headers = {"Authorization": options["CONF_API_KEY"]} +# elif options["CONF_X_API_KEY"] is not None: +# self._headers = {"x-api-key": options["CONF_X_API_KEY"]} +# else: +# self._headers = None + self._headers = None + self.info = {} + self._route_id = data["route"].split(": ")[0] + self._stop_id = data["origin"].split(": ")[0] + self._direction = data["direction"] + self._relative = False + #_LOGGER.debug("GTFS RT: Realtime data: %s", self._data) + self._data = await self.hass.async_add_executor_job(get_rt_route_statuses, self) + self._get_next_service = await self.hass.async_add_executor_job(get_next_services, self) + _LOGGER.debug("GTFS RT: Realtime next service: %s", self._get_next_service) + else: + _LOGGER.error("GTFS RT: Issue with entity options") + return "---" - return self._data + return self._get_next_service diff --git a/custom_components/gtfs2/gtfs_helper.py b/custom_components/gtfs2/gtfs_helper.py index c6fcb34..e2b0a67 100644 --- a/custom_components/gtfs2/gtfs_helper.py +++ b/custom_components/gtfs2/gtfs_helper.py @@ -9,18 +9,24 @@ from sqlalchemy.sql import text import homeassistant.util.dt as dt_util +from homeassistant.core import HomeAssistant _LOGGER = logging.getLogger(__name__) -def get_next_departure(data): - _LOGGER.debug("Get next departure with data: %s", data) +def get_next_departure(self): + _LOGGER.debug("Get next departure with data: %s", self._data) """Get next departures from data.""" - schedule = data["schedule"] - start_station_id = data["origin"] - end_station_id = data["destination"] - offset = data["offset"] - include_tomorrow = data["include_tomorrow"] + if self.hass.config.time_zone is None: 
+ _LOGGER.error("Timezone is not set in Home Assistant configuration") + timezone = "UTC" + else: + timezone=dt_util.get_time_zone(self.hass.config.time_zone) + schedule = self._data["schedule"] + start_station_id = self._data["origin"] + end_station_id = self._data["destination"] + offset = self._data["offset"] + include_tomorrow = self._data["include_tomorrow"] now = dt_util.now().replace(tzinfo=None) + datetime.timedelta(minutes=offset) now_date = now.strftime(dt_util.DATE_STR_FORMAT) yesterday = now - datetime.timedelta(days=1) @@ -42,7 +48,7 @@ def get_next_departure(data): tomorrow_order = f"calendar.{tomorrow_name} DESC," sql_query = f""" - SELECT trip.trip_id, trip.route_id,route.route_long_name, + SELECT trip.trip_id, trip.route_id,trip.trip_headsign,route.route_long_name, time(origin_stop_time.arrival_time) AS origin_arrival_time, time(origin_stop_time.departure_time) AS origin_depart_time, date(origin_stop_time.departure_time) AS origin_depart_date, @@ -87,7 +93,7 @@ def get_next_departure(data): AND calendar.start_date <= :today AND calendar.end_date >= :today UNION ALL - SELECT trip.trip_id, trip.route_id,route.route_long_name, + SELECT trip.trip_id, trip.route_id,trip.trip_headsign,route.route_long_name, time(origin_stop_time.arrival_time) AS origin_arrival_time, time(origin_stop_time.departure_time) AS origin_depart_time, date(origin_stop_time.departure_time) AS origin_depart_date, @@ -205,7 +211,7 @@ def get_next_departure(data): _LOGGER.debug( "Timetable Remaining Departures on this Start/Stop: %s", timetable_remaining ) - + # create upcoming timetable with line info timetable_remaining_line = [] for key2, value in sorted(timetable.items()): if datetime.datetime.strptime(key2, "%Y-%m-%d %H:%M:%S") > now: @@ -216,6 +222,17 @@ def get_next_departure(data): "Timetable Remaining Departures on this Start/Stop, per line: %s", timetable_remaining_line, ) + # create upcoming timetable with headsign + timetable_remaining_headsign = [] + for key2, value in sorted(timetable.items()): + if datetime.datetime.strptime(key2, "%Y-%m-%d %H:%M:%S") > now: + timetable_remaining_headsign.append( + str(key2) + " (" + str(value["trip_headsign"]) + ")" + ) + _LOGGER.debug( + "Timetable Remaining Departures on this Start/Stop, with headsign: %s", + timetable_remaining_headsign, + ) # Format arrival and departure dates and times, accounting for the # possibility of times crossing over midnight. 
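# Hypothetical helper, not taken from this patch: the README in this series states that the
# timezone for next_departure is resolved in the order agency (from the GTFS data) ->
# Home Assistant timezone -> UTC. The hunk above only shows the HA-vs-UTC part, so this
# minimal sketch illustrates the full fallback order; the function name and signature are
# assumptions for illustration only.
import homeassistant.util.dt as dt_util

def resolve_timezone(agency_timezone: str | None, hass_timezone: str | None):
    """Return a tzinfo using the agency -> HA -> UTC fallback order described in the README."""
    if agency_timezone:
        # Agency timezone delivered with the GTFS data takes precedence.
        return dt_util.get_time_zone(agency_timezone)
    if hass_timezone:
        # Otherwise fall back to the Home Assistant system timezone.
        return dt_util.get_time_zone(hass_timezone)
    # Last resort when neither is available.
    return dt_util.UTC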
@@ -243,8 +260,8 @@ def get_next_departure(data): f"{dest_depart.strftime(dt_util.DATE_STR_FORMAT)} {item['dest_depart_time']}" ) - depart_time = dt_util.parse_datetime(origin_depart_time) - arrival_time = dt_util.parse_datetime(dest_arrival_time) + depart_time = dt_util.parse_datetime(origin_depart_time).replace(tzinfo=timezone) + arrival_time = dt_util.parse_datetime(dest_arrival_time).replace(tzinfo=timezone) origin_stop_time = { "Arrival Time": origin_arrival_time, @@ -280,28 +297,37 @@ def get_next_departure(data): "destination_stop_time": destination_stop_time, "next_departures": timetable_remaining, "next_departures_lines": timetable_remaining_line, + "next_departures_headsign": timetable_remaining_headsign, } -def get_gtfs(hass, path, filename, url, update=False): +def get_gtfs(hass, path, data, update=False): """Get gtfs file.""" - file = filename + ".zip" + _LOGGER.debug("Getting gtfs with data: %s", data) + filename = data["file"] + url = data["url"] + file = data["file"] + ".zip" gtfs_dir = hass.config.path(path) os.makedirs(gtfs_dir, exist_ok=True) if update and os.path.exists(os.path.join(gtfs_dir, file)): remove_datasource(hass, path, filename) - - if not os.path.exists(os.path.join(gtfs_dir, file)): - try: - r = requests.get(url, allow_redirects=True) - open(os.path.join(gtfs_dir, file), "wb").write(r.content) - except Exception as ex: # pylint: disable=broad-except - _LOGGER.error("The given URL or GTFS data file/folder was not found") - return "no_data_file" + if data["extract_from"] == "zip": + if not os.path.exists(os.path.join(gtfs_dir, file)): + _LOGGER.error("The given GTFS zipfile was not found") + return "no_zip_file" + if data["extract_from"] == "url": + if not os.path.exists(os.path.join(gtfs_dir, file)): + try: + r = requests.get(url, allow_redirects=True) + open(os.path.join(gtfs_dir, file), "wb").write(r.content) + except Exception as ex: # pylint: disable=broad-except + _LOGGER.error("The given URL or GTFS data file/folder was not found") + return "no_data_file" (gtfs_root, _) = os.path.splitext(file) sqlite_file = f"{gtfs_root}.sqlite?check_same_thread=False" joined_path = os.path.join(gtfs_dir, sqlite_file) + _LOGGER.debug("unpacking: %s", joined_path) gtfs = pygtfs.Schedule(joined_path) # check or wait for unpack journal = os.path.join(gtfs_dir, filename + ".sqlite-journal") @@ -314,7 +340,7 @@ def get_gtfs(hass, path, filename, url, update=False): def get_route_list(schedule): sql_routes = f""" - SELECT route_id, route_long_name from routes + SELECT route_id, route_short_name, route_long_name from routes order by cast(route_id as decimal) """ # noqa: S608 result = schedule.engine.connect().execute( @@ -327,7 +353,7 @@ def get_route_list(schedule): row = row_cursor._asdict() routes_list.append(list(row_cursor)) for x in routes_list: - val = x[0] + ": " + x[1] + val = x[0] + ": " + x[1] + " (" + x[2] + ")" routes.append(val) _LOGGER.debug(f"routes: {routes}") return routes diff --git a/custom_components/gtfs2/gtfs_rt_helper.py b/custom_components/gtfs2/gtfs_rt_helper.py new file mode 100644 index 0000000..b8b83b0 --- /dev/null +++ b/custom_components/gtfs2/gtfs_rt_helper.py @@ -0,0 +1,308 @@ +import logging +from datetime import datetime, timedelta + +import homeassistant.helpers.config_validation as cv +import homeassistant.util.dt as dt_util +import requests +import voluptuous as vol +from google.transit import gtfs_realtime_pb2 +from homeassistant.components.sensor import PLATFORM_SCHEMA +from homeassistant.const import ATTR_LATITUDE, ATTR_LONGITUDE, 
CONF_NAME +from homeassistant.helpers.entity import Entity +from homeassistant.util import Throttle + +_LOGGER = logging.getLogger(__name__) + +ATTR_STOP_ID = "Stop ID" +ATTR_ROUTE = "Route" +ATTR_DIRECTION_ID = "Direction ID" +ATTR_DUE_IN = "Due in" +ATTR_DUE_AT = "Due at" +ATTR_NEXT_UP = "Next Service" +ATTR_ICON = "Icon" +ATTR_UNIT_OF_MEASUREMENT = "unit_of_measurement" +ATTR_DEVICE_CLASS = "device_class" + +CONF_API_KEY = "api_key" +CONF_X_API_KEY = "x_api_key" +CONF_STOP_ID = "stopid" +CONF_ROUTE = "route" +CONF_DIRECTION_ID = "directionid" +CONF_DEPARTURES = "departures" +CONF_TRIP_UPDATE_URL = "trip_update_url" +CONF_VEHICLE_POSITION_URL = "vehicle_position_url" +CONF_ROUTE_DELIMITER = "route_delimiter" +CONF_ICON = "icon" +CONF_SERVICE_TYPE = "service_type" +CONF_RELATIVE_TIME = "show_relative_time" + +DEFAULT_SERVICE = "Service" +DEFAULT_ICON = "mdi:bus" +DEFAULT_DIRECTION = "0" + +MIN_TIME_BETWEEN_UPDATES = timedelta(seconds=60) +TIME_STR_FORMAT = "%H:%M" + +PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend( + { + vol.Required(CONF_TRIP_UPDATE_URL): cv.string, + vol.Optional(CONF_API_KEY): cv.string, + vol.Optional(CONF_X_API_KEY): cv.string, + vol.Optional(CONF_VEHICLE_POSITION_URL): cv.string, + vol.Optional(CONF_ROUTE_DELIMITER): cv.string, + + vol.Optional(CONF_DEPARTURES): [ + { + vol.Required(CONF_NAME): cv.string, + vol.Required(CONF_STOP_ID): cv.string, + vol.Required(CONF_ROUTE): cv.string, + vol.Optional(CONF_RELATIVE_TIME, default=True): cv.boolean, + vol.Optional( + CONF_DIRECTION_ID, + default=DEFAULT_DIRECTION, # type: ignore + ): str, + vol.Optional( + CONF_ICON, default=DEFAULT_ICON # type: ignore + ): cv.string, + vol.Optional( + CONF_SERVICE_TYPE, default=DEFAULT_SERVICE # type: ignore + ): cv.string, + } + ], + } +) + +def due_in_minutes(timestamp): + """Get the remaining minutes from now until a given datetime object.""" + diff = timestamp - dt_util.now().replace(tzinfo=None) + return int(diff.total_seconds() / 60) + + +def log_info(data: list, indent_level: int) -> None: + indents = " " * indent_level + info_str = f"{indents}{': '.join(str(x) for x in data)}" + _LOGGER.info(info_str) + + +def log_error(data: list, indent_level: int) -> None: + indents = " " * indent_level + info_str = f"{indents}{': '.join(str(x) for x in data)}" + _LOGGER.error(info_str) + + +def log_debug(data: list, indent_level: int) -> None: + indents = " " * indent_level + info_str = f"{indents}{' '.join(str(x) for x in data)}" + _LOGGER.debug(info_str) + + + +def get_gtfs_feed_entities(url: str, headers, label: str): + feed = gtfs_realtime_pb2.FeedMessage() # type: ignore + + # TODO add timeout to requests call + response = requests.get(url, headers=headers, timeout=20) + if response.status_code == 200: + log_info([f"Successfully updated {label}", response.status_code], 0) + else: + log_error( + [ + f"Updating {label} got", + response.status_code, + response.content, + ], + 0, + ) + + feed.ParseFromString(response.content) + return feed.entity + + + +## reworked for gtfs2 + +def get_next_services(self): + self.data = self._data + self._stop = self._stop_id + self._route = self._route_id + self._direction = self._direction + _LOGGER.debug("Get Next Services, route/direction/stop: %s", self.data.get(self._route, {}).get(self._direction, {}).get(self._stop, [])) + + next_services = self.data.get(self._route, {}).get(self._direction, {}).get(self._stop, []) + if self.hass.config.time_zone is None: + _LOGGER.error("Timezone is not set in Home Assistant configuration") + timezone = "UTC" + else: + 
timezone=dt_util.get_time_zone(self.hass.config.time_zone) + + if self._relative : + return ( + due_in_minutes(next_services[0].arrival_time) + if len(next_services) > 0 + else "-" + ) + else: + return ( + next_services[0].arrival_time.replace(tzinfo=timezone) + if len(next_services) > 0 + else "-" + ) + +def get_rt_route_statuses(self): + + vehicle_positions = {} + + + class StopDetails: + def __init__(self, arrival_time, position): + self.arrival_time = arrival_time + self.position = position + + departure_times = {} + + feed_entities = get_gtfs_feed_entities( + url=self._trip_update_url, headers=self._headers, label="trip data" + ) + + for entity in feed_entities: + if entity.HasField("trip_update"): + # If delimiter specified split the route ID in the gtfs rt feed + log_debug( + [ + "Received Trip ID", + entity.trip_update.trip.trip_id, + "Route ID:", + entity.trip_update.trip.route_id, + "direction ID", + entity.trip_update.trip.direction_id, + "Start Time:", + entity.trip_update.trip.start_time, + "Start Date:", + entity.trip_update.trip.start_date, + ], + 1, + ) + if self._route_delimiter is not None: + route_id_split = entity.trip_update.trip.route_id.split( + self._route_delimiter + ) + if route_id_split[0] == self._route_delimiter: + route_id = entity.trip_update.trip.route_id + else: + route_id = route_id_split[0] + log_debug( + [ + "Feed Route ID", + entity.trip_update.trip.route_id, + "changed to", + route_id, + ], + 1, + ) + + else: + route_id = entity.trip_update.trip.route_id + + if route_id not in departure_times: + departure_times[route_id] = {} + + if entity.trip_update.trip.direction_id is not None: + direction_id = str(entity.trip_update.trip.direction_id) + else: + direction_id = DEFAULT_DIRECTION + if direction_id not in departure_times[route_id]: + departure_times[route_id][direction_id] = {} + + for stop in entity.trip_update.stop_time_update: + stop_id = stop.stop_id + if not departure_times[route_id][direction_id].get( + stop_id + ): + departure_times[route_id][direction_id][stop_id] = [] + # Use stop arrival time; + # fall back on departure time if not available + if stop.arrival.time == 0: + stop_time = stop.departure.time + else: + stop_time = stop.arrival.time + log_debug( + [ + "Stop:", + stop_id, + "Stop Sequence:", + stop.stop_sequence, + "Stop Time:", + stop_time, + ], + 2, + ) + # Ignore arrival times in the past + if due_in_minutes(datetime.fromtimestamp(stop_time)) >= 0: + log_debug( + [ + "Adding route ID", + route_id, + "trip ID", + entity.trip_update.trip.trip_id, + "direction ID", + entity.trip_update.trip.direction_id, + "stop ID", + stop_id, + "stop time", + stop_time, + ], + 3, + ) + + details = StopDetails( + datetime.fromtimestamp(stop_time), + vehicle_positions.get( + entity.trip_update.trip.trip_id + ), + ) + departure_times[route_id][direction_id][ + stop_id + ].append(details) + + # Sort by arrival time + for route in departure_times: + for direction in departure_times[route]: + for stop in departure_times[route][direction]: + departure_times[route][direction][stop].sort( + key=lambda t: t.arrival_time + ) + + self.info = departure_times + + return departure_times + +def get_rt_vehicle_positions(self): + positions = {} + feed_entities = get_gtfs_feed_entities( + url=self._vehicle_position_url, + headers=self._headers, + label="vehicle positions", + ) + + for entity in feed_entities: + vehicle = entity.vehicle + + if not vehicle.trip.trip_id: + # Vehicle is not in service + continue + log_debug( + [ + "Adding position for trip ID", + 
vehicle.trip.trip_id, + "position latitude", + vehicle.position.latitude, + "longitude", + vehicle.position.longitude, + ], + 2, + ) + + positions[vehicle.trip.trip_id] = vehicle.position + + return positions + diff --git a/custom_components/gtfs2/manifest.json b/custom_components/gtfs2/manifest.json index 9349aaf..9c94526 100644 --- a/custom_components/gtfs2/manifest.json +++ b/custom_components/gtfs2/manifest.json @@ -7,5 +7,5 @@ "iot_class": "local_polling", "issue_tracker": "https://github.com/vingerha/gtfs2/issues", "requirements": ["pygtfs==0.1.9"], - "version": "0.1.2" + "version": "0.1.5" } diff --git a/custom_components/gtfs2/sensor.py b/custom_components/gtfs2/sensor.py index 4498762..30800fd 100644 --- a/custom_components/gtfs2/sensor.py +++ b/custom_components/gtfs2/sensor.py @@ -12,6 +12,8 @@ from homeassistant.util import slugify import homeassistant.util.dt as dt_util +from .coordinator import GTFSRealtimeUpdateCoordinator + from .const import ( ATTR_ARRIVAL, ATTR_BICYCLE, @@ -64,7 +66,7 @@ async def async_setup_entry( ) -> None: """Initialize the setup.""" coordinator: GTFSUpdateCoordinator = hass.data[DOMAIN][config_entry.entry_id][ - "coordinator" + "coordinator" ] await coordinator.async_config_entry_first_refresh() @@ -74,6 +76,8 @@ async def async_setup_entry( ] async_add_entities(sensors, False) + + class GTFSDepartureSensor(CoordinatorEntity, SensorEntity): @@ -82,7 +86,6 @@ class GTFSDepartureSensor(CoordinatorEntity, SensorEntity): def __init__(self, coordinator) -> None: """Initialize the GTFSsensor.""" super().__init__(coordinator) - self._name = coordinator.data["name"] self._attributes: dict[str, Any] = {} @@ -185,7 +188,7 @@ def _update_attrs(self): # noqa: C901 PLR0911 ) self._agency = False - # Define the state as a UTC timestamp with ISO 8601 format + # Define the state as a Agency TZ, then help TZ (which is UTC if no HA TZ set) if not self._departure: self._state = None elif self._agency: @@ -198,10 +201,11 @@ def _update_attrs(self): # noqa: C901 PLR0911 ) else: _LOGGER.debug( - "Self._departure time for state value UTC: %s", + "Self._departure time from helper: %s", {self._departure["departure_time"]}, ) - self._state = self._departure["departure_time"].replace(tzinfo=dt_util.UTC) + self._state = self._departure["departure_time"] + # settin state value self._attr_native_value = self._state @@ -369,6 +373,14 @@ def _update_attrs(self): # noqa: C901 PLR0911 self._attributes["next_departures_lines"] = self._departure[ "next_departures_lines" ][:10] + + # Add next departures with their headsign + prefix = "next_departures_headsign" + self._attributes["next_departures_headsign"] = [] + if self._next_departures: + self._attributes["next_departures_headsign"] = self._departure[ + "next_departures_headsign" + ][:10] self._attributes["updated_at"] = dt_util.now().replace(tzinfo=None) self._attr_extra_state_attributes = self._attributes @@ -398,3 +410,38 @@ def remove_keys(self, prefix: str) -> None: self._attributes = { k: v for k, v in self._attributes.items() if not k.startswith(prefix) } + + +class GTFSRealtimeDepartureSensor(CoordinatorEntity): + """Implementation of a GTFS departure sensor.""" + + def __init__(self, coordinator: GTFSRealtimeUpdateCoordinator) -> None: + """Initialize the GTFSsensor.""" + super().__init__(coordinator) + self._name = coordinator.data["name"] + "_rt" + self._attributes: dict[str, Any] = {} + + self._attr_unique_id = f"gtfs-{self._name}_rt" + self._attr_device_info = DeviceInfo( + name=f"GTFS - {self._name}", + 
entry_type=DeviceEntryType.SERVICE, + identifiers={(DOMAIN, f"GTFS - {self._name}_rt")}, + manufacturer="GTFS", + model=self._name, + ) + _LOGGER.debug("GTFS RT Sensor: coordinator data: %s", coordinator.data ) + self._coordinator = coordinator + self._attributes = self._update_attrs_rt() + self._attr_extra_state_attributes = self._attributes + + @callback + def _handle_coordinator_update(self) -> None: + """Handle updated data from the coordinator.""" + self._update_attrs_rt() + super()._handle_coordinator_update() + + def _update_attrs_rt(self): # noqa: C901 PLR0911 + _LOGGER.debug(f"GTFS RT Sensor update attr DATA: {self._coordinator.data}") + self._attr_native_value = coordinator.data + self._attributes["next_departure_realtime"] = self._coordinator.data + return self._attributes \ No newline at end of file diff --git a/custom_components/gtfs2/strings.json b/custom_components/gtfs2/strings.json index 3b2aeac..2426849 100644 --- a/custom_components/gtfs2/strings.json +++ b/custom_components/gtfs2/strings.json @@ -12,7 +12,7 @@ "file": "New datasource name", "url": "external url to gtfs data file" }, - "description": "NOTE: with a new url: this may take minutes after submit" + "description": "NOTE: with a new url/zip, this may take a long time after submit" }, "remove": { "data": { @@ -43,12 +43,31 @@ "unknown": "[%key:common::config_flow::error::unknown%]", "stop_incorrect": "Start and/or End destination probably incorrect, check logs", "generic_failure": "Overall failure, check logs", - "no_data_file": "Data collection issue: URL incorrect or filename not in the correct folder" + "no_data_file": "Data collection issue: URL incorrect or filename not in the correct folder", + "no_zip_file": "Data collection issue: ZIP file not in the correct folder" }, "abort": { "already_configured": "[%key:common::config_flow::abort::already_configured_device%]", "files_deleted": "Datasource deleted, this may affect existing routes", - "stop_incorrect": "Start and/or End destination incorrect, possibly not in same direction, check logs" + "stop_incorrect": "Start and/or End destination incorrect, possibly not in same direction, check logs", + "no_zip_file": "Data collection issue: ZIP file not in the correct folder" + } + }, + "options": { + "step": { + "init": { + "description": "Customize the way the integration works", + "data": { + "refresh_interval": "Data refresh interval (in minutes)" + } + }, + "real_time": { + "description": "Provide url to real time API", + "data": { + "trip_update_url": "URL to trip data", + "vehicle_position_url": "URL to vehicle position (can be the same as trip data)" + } + } } } } diff --git a/custom_components/gtfs2/translations/en.json b/custom_components/gtfs2/translations/en.json index 8654ce0..66d1da6 100644 --- a/custom_components/gtfs2/translations/en.json +++ b/custom_components/gtfs2/translations/en.json @@ -12,7 +12,7 @@ "file": "New datasource name", "url": "external url to gtfs data (zip) file" }, - "description": "NOTE: with a new url, this may take quite a bit of time, \n depending on file-size and system performance" + "description": "NOTE: with a new url/zip, this may take quite a bit of time, \n depending on file-size and system performance" }, "route": { "data": { @@ -37,12 +37,32 @@ "unknown": "[%key:common::config_flow::error::unknown%]", "stop_incorrect": "Start and/or End destination incorrect, possibly no transport 'today' or not in same direction, check logs", "generic_failure": "Overall failure, check logs", - "no_data_file": "Data collection 
issue: URL incorrect or filename not in the correct folder" + "no_data_file": "Data collection issue: URL incorrect or filename not in the correct folder", + "no_zip_file": "Data collection issue: ZIP file not in the correct folder" }, "abort": { "already_configured": "[%key:common::config_flow::abort::already_configured_device%]", "files_deleted": "Datasource deleted, this may affect existing routes", - "stop_incorrect": "Start and/or End destination incorrect, possibly no transport 'today' or not in same direction, check logs" + "stop_incorrect": "Start and/or End destination incorrect, possibly no transport 'today' or not in same direction, check logs", + "no_data_file": "Data collection issue: URL incorrect or filename not in the correct folder", + "no_zip_file": "Data collection issue: ZIP file not existing in the correct folder, note that it is capital-sensitive" + } + }, + "options": { + "step": { + "init": { + "description": "Customize the way the integration works", + "data": { + "refresh_interval": "Data refresh interval (in minutes)" + } + }, + "real_time": { + "description": "Provide url to real time API", + "data": { + "trip_update_url": "URL to trip data", + "vehicle_position_url": "URL to vehicle position (can be the same as trip data)" + } + } } } } diff --git a/custom_components/gtfs2/translations/fr.json b/custom_components/gtfs2/translations/fr.json index 4fb1911..72f02bc 100644 --- a/custom_components/gtfs2/translations/fr.json +++ b/custom_components/gtfs2/translations/fr.json @@ -12,7 +12,7 @@ "file": "Nom de la nouvelle source de données", "url": "URL externe vers le fichier (zip) des données GTFS" }, - "description": "REMARQUE : avec une nouvelle URL, cela peut prendre du temps après la soumission, selon la taille du fichier et performance du serveur" + "description": "REMARQUE: avec une nouvelle URL/zip, cela peut prendre du temps après la soumission, \n selon la taille du fichier et performance du serveur" }, "route": { "data": { @@ -37,12 +37,25 @@ "unknown": "[%key:common::config_flow::error::unknown%]", "stop_incorrect": "Arrêt de départ et/ou de fin incorrecte, éventuellement pas de transport « aujourd'hui » ou pas dans la même direction, vérifiez les logs d'érreur", "generic_failure": "Échec global, vérifiez les logs d'érreur", - "no_data_file": "Problème de collecte de données : URL incorrecte ou nom de fichier ne se trouvant pas dans le bon dossier" + "no_data_file": "Problème de collecte de données : URL incorrecte ou nom de fichier ne se trouve pas dans le bon dossier", + "no_zip_file": "Problème de collecte de données : fichier ZIP ne se trouve pas dans le bon dossier" }, "abort": { "already_configured": "[%key:common::config_flow::abort::already_configured_device%]", "files_deleted": "Source de données supprimée, cela peut affecter les itinéraires existants", - "stop_incorrect": "Arrêt de départ et/ou de fin incorrecte, éventuellement pas de transport « aujourd'hui » ou pas dans la même direction, vérifiez logs d'érreur" + "stop_incorrect": "Arrêt de départ et/ou de fin incorrecte, éventuellement pas de transport « aujourd'hui » ou pas dans la même direction, vérifiez logs d'érreur", + "no_data_file": "Problème de collecte de données : URL incorrecte ou nom de fichier ne se trouve pas dans le bon dossier", + "no_zip_file": "Problème de collecte de données : fichier ZIP ne se trouve pas dans le bon dossier, note: sensible à la casse" + } + }, + "options": { + "step": { + "init": { + "description": "Personnalisez le fonctionnement de l'intégration", + 
"data": { + "refresh_interval": "Personnalisez le fonctionnement de l'intégration" + } + } } } } \ No newline at end of file From dda12cc4c58e7103c3423fa600579b092f40040e Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Fri, 10 Nov 2023 10:04:35 +0100 Subject: [PATCH 09/18] Update README.md --- README.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/README.md b/README.md index b732874..e11bda0 100644 --- a/README.md +++ b/README.md @@ -18,10 +18,10 @@ Core GTFS uses start + stop, it then determines every option between them and pr ## Updates 20231110: adding features: -- timezone check is now in order: agency (deliverung data), if not > HA system, if not > UTC -- attribute next_departure_headsigns -- adding route shortname in selection/list +- new attribute: next_departure_headsigns +- adding route shortname in selection/list to overcome data discrepancies been short name and long name - for new datasource, allow to use a self-placed zip file in the gtfs2 folder. This for zip that are not available via URL or zip with data that may need modification to comply with extraction conditions by pygtfs +- timezone fo next_departure is now used in order: agency (delivering data), if not > HA system, if not > UTC. This to resolve TZ issues for datasets without agency (timezone) 20231104: initial version From b9fe27ca705670e7758802379f695b146bcb9259 Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Fri, 10 Nov 2023 10:05:13 +0100 Subject: [PATCH 10/18] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index e11bda0..ff8c4a9 100644 --- a/README.md +++ b/README.md @@ -21,7 +21,7 @@ Core GTFS uses start + stop, it then determines every option between them and pr - new attribute: next_departure_headsigns - adding route shortname in selection/list to overcome data discrepancies been short name and long name - for new datasource, allow to use a self-placed zip file in the gtfs2 folder. This for zip that are not available via URL or zip with data that may need modification to comply with extraction conditions by pygtfs -- timezone fo next_departure is now used in order: agency (delivering data), if not > HA system, if not > UTC. This to resolve TZ issues for datasets without agency (timezone) +- timezone for next_departure is now used in order: agency (delivering data), if not > HA system, if not > UTC. 
This to resolve TZ issues for datasets without agency (timezone) 20231104: initial version From 9bbc70df0e3034cd503de5f07fc1198b9c02acd0 Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Fri, 10 Nov 2023 15:13:58 +0100 Subject: [PATCH 11/18] Draft release adding realtime --- custom_components/gtfs2/__init__.py | 7 +- custom_components/gtfs2/config_flow.py | 3 +- custom_components/gtfs2/coordinator.py | 141 ++++++++++------------ custom_components/gtfs2/gtfs_helper.py | 1 + custom_components/gtfs2/gtfs_rt_helper.py | 2 +- custom_components/gtfs2/manifest.json | 2 +- custom_components/gtfs2/sensor.py | 49 ++------ 7 files changed, 88 insertions(+), 117 deletions(-) diff --git a/custom_components/gtfs2/__init__.py b/custom_components/gtfs2/__init__.py index fa688c5..b0e21e7 100644 --- a/custom_components/gtfs2/__init__.py +++ b/custom_components/gtfs2/__init__.py @@ -8,7 +8,7 @@ from datetime import timedelta from .const import DOMAIN, PLATFORMS, DEFAULT_PATH, DEFAULT_REFRESH_INTERVAL -from .coordinator import GTFSUpdateCoordinator, GTFSRealtimeUpdateCoordinator +from .coordinator import GTFSUpdateCoordinator import voluptuous as vol from .gtfs_helper import get_gtfs @@ -40,6 +40,9 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool: #await coordinator.async_config_entry_first_refresh() + if not coordinator.last_update_success: + raise ConfigEntryNotReady + hass.data[DOMAIN][entry.entry_id] = { "coordinator": coordinator, } @@ -74,6 +77,6 @@ def update_gtfs(call): async def update_listener(hass: HomeAssistant, entry: ConfigEntry): """Handle options update.""" - hass.data[DOMAIN][entry.entry_id]['coordinator'].update_interval = timedelta(minutes=entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)) + hass.data[DOMAIN][entry.entry_id]['coordinator'].update_interval = timedelta(minutes=1) return True \ No newline at end of file diff --git a/custom_components/gtfs2/config_flow.py b/custom_components/gtfs2/config_flow.py index aeef210..90bed8c 100644 --- a/custom_components/gtfs2/config_flow.py +++ b/custom_components/gtfs2/config_flow.py @@ -246,7 +246,6 @@ async def async_step_init( ) -> FlowResult: """Manage the options.""" if user_input is not None: - user_input['real_time'] = False if user_input['real_time']: self._user_inputs.update(user_input) _LOGGER.debug(f"GTFS Options with realtime: {self._user_inputs}") @@ -261,7 +260,7 @@ async def async_step_init( data_schema=vol.Schema( { vol.Optional("refresh_interval", default=self.config_entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)): int, -# vol.Required("real_time"): vol.In({False: "No", True: "Yes"}), + vol.Required("real_time"): vol.In({False: "No", True: "Yes"}), } ), ) diff --git a/custom_components/gtfs2/coordinator.py b/custom_components/gtfs2/coordinator.py index c04f29a..a9fbb1c 100644 --- a/custom_components/gtfs2/coordinator.py +++ b/custom_components/gtfs2/coordinator.py @@ -4,9 +4,11 @@ from datetime import timedelta import logging + from homeassistant.config_entries import ConfigEntry from homeassistant.core import HomeAssistant from homeassistant.helpers.update_coordinator import DataUpdateCoordinator +import homeassistant.util.dt as dt_util from .const import DEFAULT_PATH, DEFAULT_REFRESH_INTERVAL from .gtfs_helper import get_gtfs, get_next_departure, check_datasource_index @@ -26,7 +28,7 @@ def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: hass=hass, logger=_LOGGER, name=entry.entry_id, - 
update_interval=timedelta(minutes=entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)), + update_interval=timedelta(minutes=1), ) self.config_entry = entry self.hass = hass @@ -35,87 +37,78 @@ def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: self._data: dict[str, str] = {} async def _async_update_data(self) -> dict[str, str]: - """Update.""" + """Get the latest data from GTFS and GTFS relatime, depending refresh interval""" data = self.config_entry.data options = self.config_entry.options self._pygtfs = get_gtfs( self.hass, DEFAULT_PATH, data, False ) - self._data = { - "schedule": self._pygtfs, - "origin": data["origin"].split(": ")[0], - "destination": data["destination"].split(": ")[0], - "offset": data["offset"], - "include_tomorrow": data["include_tomorrow"], - "gtfs_dir": DEFAULT_PATH, - "name": data["name"], - } + previous_data = None if self.data is None else self.data.copy() + _LOGGER.debug("Previous data: %s", previous_data) - check_index = await self.hass.async_add_executor_job( - check_datasource_index, self._pygtfs - ) - - try: - self._data["next_departure"] = await self.hass.async_add_executor_job( - get_next_departure, self - ) - except Exception as ex: # pylint: disable=broad-except - _LOGGER.error("Error getting gtfs data from generic helper: %s", ex) - _LOGGER.debug("GTFS coordinator data from helper: %s", self._data["next_departure"]) - return self._data + if previous_data is not None and (previous_data["next_departure"]["gtfs_updated_at"] + timedelta(minutes=options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL))) > dt_util.now().replace(tzinfo=None): + _LOGGER.debug("Do nothing") + self._data = previous_data + + if previous_data is None or (previous_data["next_departure"]["gtfs_updated_at"] + timedelta(minutes=options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL))) < dt_util.now().replace(tzinfo=None): + self._data = { + "schedule": self._pygtfs, + "origin": data["origin"].split(": ")[0], + "destination": data["destination"].split(": ")[0], + "offset": data["offset"], + "include_tomorrow": data["include_tomorrow"], + "gtfs_dir": DEFAULT_PATH, + "name": data["name"], + } -class GTFSRealtimeUpdateCoordinator(DataUpdateCoordinator): - """Data update coordinator for the GTFSRT integration.""" - - config_entry: ConfigEntry - - - def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: - """Initialize the coordinator.""" - _LOGGER.debug("GTFS RT: coordinator init") - super().__init__( - hass=hass, - logger=_LOGGER, - name=entry.entry_id, - update_interval=timedelta(minutes=entry.options.get("refresh_interval_rt", DEFAULT_REFRESH_INTERVAL_RT)), - ) - self.config_entry = entry - self.hass = hass - self._data: dict[str, str] = {} - - async def _async_update_data(self) -> dict[str, str]: - """Update.""" - data = self.config_entry.data - options = self.config_entry.options - _LOGGER.debug("GTFS RT: coordinator async_update_data: %s", data) - _LOGGER.debug("GTFS RT: coordinator async_update_data options: %s", options) - #add real_time if setup - + check_index = await self.hass.async_add_executor_job( + check_datasource_index, self._pygtfs + ) + + try: + self._data["next_departure"] = await self.hass.async_add_executor_job( + get_next_departure, self + ) + except Exception as ex: # pylint: disable=broad-except + _LOGGER.error("Error getting gtfs data from generic helper: %s", ex) + return None + _LOGGER.debug("GTFS coordinator data from helper: %s", self._data["next_departure"]) + if "real_time" in options: - - """Initialize the 
info object.""" - self._trip_update_url = options["trip_update_url"] - self._vehicle_position_url = options["vehicle_position_url"] - self._route_delimiter = "-" -# if options["CONF_API_KEY"] is not None: -# self._headers = {"Authorization": options["CONF_API_KEY"]} -# elif options["CONF_X_API_KEY"] is not None: -# self._headers = {"x-api-key": options["CONF_X_API_KEY"]} -# else: -# self._headers = None - self._headers = None - self.info = {} - self._route_id = data["route"].split(": ")[0] - self._stop_id = data["origin"].split(": ")[0] - self._direction = data["direction"] - self._relative = False - #_LOGGER.debug("GTFS RT: Realtime data: %s", self._data) - self._data = await self.hass.async_add_executor_job(get_rt_route_statuses, self) - self._get_next_service = await self.hass.async_add_executor_job(get_next_services, self) - _LOGGER.debug("GTFS RT: Realtime next service: %s", self._get_next_service) + if options["real_time"]: + + """Initialize the info object.""" + self._trip_update_url = options["trip_update_url"] + self._vehicle_position_url = options["vehicle_position_url"] + self._route_delimiter = "-" + # if options["CONF_API_KEY"] is not None: + # self._headers = {"Authorization": options["CONF_API_KEY"]} + # elif options["CONF_X_API_KEY"] is not None: + # self._headers = {"x-api-key": options["CONF_X_API_KEY"]} + # else: + # self._headers = None + self._headers = None + self.info = {} + self._route_id = data["route"].split(": ")[0] + self._stop_id = data["origin"].split(": ")[0] + self._direction = data["direction"] + self._relative = False + #_LOGGER.debug("GTFS RT: Realtime data: %s", self._data) + try: + self._get_rt_route_statuses = await self.hass.async_add_executor_job(get_rt_route_statuses, self) + self._get_next_service = await self.hass.async_add_executor_job(get_next_services, self) + except Exception as ex: # pylint: disable=broad-except + _LOGGER.error("Error getting gtfs realtime data: %s", ex) + self._get_next_service = "error" + else: + _LOGGER.info("GTFS RT: RealTime = false, selected in entity options") + self._get_next_service = "n.a." else: - _LOGGER.error("GTFS RT: Issue with entity options") - return "---" + _LOGGER.debug("GTFS RT: RealTime not selected in entity options") + self._get_next_service = "n.a." 
+ self._data["next_departure"]["next_departure_realtime"] = self._get_next_service + self._data["next_departure"]["gtfs_rt_updated_at"] = dt_util.now().replace(tzinfo=None) + + return self._data - return self._get_next_service diff --git a/custom_components/gtfs2/gtfs_helper.py b/custom_components/gtfs2/gtfs_helper.py index e2b0a67..331ecd1 100644 --- a/custom_components/gtfs2/gtfs_helper.py +++ b/custom_components/gtfs2/gtfs_helper.py @@ -298,6 +298,7 @@ def get_next_departure(self): "next_departures": timetable_remaining, "next_departures_lines": timetable_remaining_line, "next_departures_headsign": timetable_remaining_headsign, + "gtfs_updated_at": dt_util.now().replace(tzinfo=None), } diff --git a/custom_components/gtfs2/gtfs_rt_helper.py b/custom_components/gtfs2/gtfs_rt_helper.py index b8b83b0..5d9940d 100644 --- a/custom_components/gtfs2/gtfs_rt_helper.py +++ b/custom_components/gtfs2/gtfs_rt_helper.py @@ -122,7 +122,7 @@ def get_gtfs_feed_entities(url: str, headers, label: str): ## reworked for gtfs2 def get_next_services(self): - self.data = self._data + self.data = self._get_rt_route_statuses self._stop = self._stop_id self._route = self._route_id self._direction = self._direction diff --git a/custom_components/gtfs2/manifest.json b/custom_components/gtfs2/manifest.json index 9c94526..8dbe4c3 100644 --- a/custom_components/gtfs2/manifest.json +++ b/custom_components/gtfs2/manifest.json @@ -6,6 +6,6 @@ "documentation": "https://github.com/vingerha/gtfs2", "iot_class": "local_polling", "issue_tracker": "https://github.com/vingerha/gtfs2/issues", - "requirements": ["pygtfs==0.1.9"], + "requirements": ["pygtfs==0.1.9","gtfs-realtime-bindings==1.0.0"], "version": "0.1.5" } diff --git a/custom_components/gtfs2/sensor.py b/custom_components/gtfs2/sensor.py index 30800fd..102120c 100644 --- a/custom_components/gtfs2/sensor.py +++ b/custom_components/gtfs2/sensor.py @@ -12,8 +12,6 @@ from homeassistant.util import slugify import homeassistant.util.dt as dt_util -from .coordinator import GTFSRealtimeUpdateCoordinator - from .const import ( ATTR_ARRIVAL, ATTR_BICYCLE, @@ -381,8 +379,18 @@ def _update_attrs(self): # noqa: C901 PLR0911 self._attributes["next_departures_headsign"] = self._departure[ "next_departures_headsign" ][:10] - - self._attributes["updated_at"] = dt_util.now().replace(tzinfo=None) + + # Add next departure realtime + self._attributes["next_departure_realtime"] = self._departure[ + "next_departure_realtime" + ] + self._attributes["gtfs_rt_updated_at"] = self._departure[ + "gtfs_rt_updated_at" + ] + + self._attributes["gtfs_updated_at"] = self._departure[ + "gtfs_updated_at" + ] self._attr_extra_state_attributes = self._attributes return self._attr_extra_state_attributes @@ -412,36 +420,3 @@ def remove_keys(self, prefix: str) -> None: } -class GTFSRealtimeDepartureSensor(CoordinatorEntity): - """Implementation of a GTFS departure sensor.""" - - def __init__(self, coordinator: GTFSRealtimeUpdateCoordinator) -> None: - """Initialize the GTFSsensor.""" - super().__init__(coordinator) - self._name = coordinator.data["name"] + "_rt" - self._attributes: dict[str, Any] = {} - - self._attr_unique_id = f"gtfs-{self._name}_rt" - self._attr_device_info = DeviceInfo( - name=f"GTFS - {self._name}", - entry_type=DeviceEntryType.SERVICE, - identifiers={(DOMAIN, f"GTFS - {self._name}_rt")}, - manufacturer="GTFS", - model=self._name, - ) - _LOGGER.debug("GTFS RT Sensor: coordinator data: %s", coordinator.data ) - self._coordinator = coordinator - self._attributes = self._update_attrs_rt() - 
self._attr_extra_state_attributes = self._attributes - - @callback - def _handle_coordinator_update(self) -> None: - """Handle updated data from the coordinator.""" - self._update_attrs_rt() - super()._handle_coordinator_update() - - def _update_attrs_rt(self): # noqa: C901 PLR0911 - _LOGGER.debug(f"GTFS RT Sensor update attr DATA: {self._coordinator.data}") - self._attr_native_value = coordinator.data - self._attributes["next_departure_realtime"] = self._coordinator.data - return self._attributes \ No newline at end of file From e515bda37e2c45d625b67689ed252dfb49d7fd0e Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Fri, 10 Nov 2023 15:21:31 +0100 Subject: [PATCH 12/18] Add migration components --- custom_components/gtfs2/__init__.py | 23 ++++++++++++++++++----- custom_components/gtfs2/config_flow.py | 2 +- 2 files changed, 19 insertions(+), 6 deletions(-) diff --git a/custom_components/gtfs2/__init__.py b/custom_components/gtfs2/__init__.py index b0e21e7..4e71276 100644 --- a/custom_components/gtfs2/__init__.py +++ b/custom_components/gtfs2/__init__.py @@ -20,12 +20,25 @@ async def async_migrate_entry(hass, config_entry: ConfigEntry) -> bool: if config_entry.version == 1: - new = {**config_entry.data} - new['extract_from'] = 'url' - new.pop('refresh_interval') - - config_entry.version = 2 + new_data = {**config_entry.data} + new_data['extract_from'] = 'url' + new_data.pop('refresh_interval') + + new_options = {**config_entry.options} + new_options['real_time'] = False + new_options['refresh_interval'] = 15 + + config_entry.version = 3 hass.config_entries.async_update_entry(config_entry, data=new) + hass.config_entries.async_update_entry(config_entry, options=new_options) + + if config_entry.version == 2: + + new_options = {**config_entry.options} + new_options['real_time'] = False + + config_entry.version = 3 + hass.config_entries.async_update_entry(config_entry, options=new_options) _LOGGER.debug("Migration to version %s successful", config_entry.version) diff --git a/custom_components/gtfs2/config_flow.py b/custom_components/gtfs2/config_flow.py index 90bed8c..1d56101 100644 --- a/custom_components/gtfs2/config_flow.py +++ b/custom_components/gtfs2/config_flow.py @@ -34,7 +34,7 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN): """Handle a config flow for GTFS.""" - VERSION = 2 + VERSION = 3 def __init__(self) -> None: """Init ConfigFlow.""" From d69b47ed12418d84ce93c9a5aea8eef77b5cf14f Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Sat, 11 Nov 2023 07:29:21 +0100 Subject: [PATCH 13/18] Align, back to no realtime --- custom_components/gtfs2/__init__.py | 30 ++--- custom_components/gtfs2/config_flow.py | 5 +- custom_components/gtfs2/coordinator.py | 141 ++++++++++++---------- custom_components/gtfs2/gtfs_helper.py | 1 - custom_components/gtfs2/gtfs_rt_helper.py | 2 +- custom_components/gtfs2/manifest.json | 2 +- custom_components/gtfs2/sensor.py | 49 ++++++-- 7 files changed, 123 insertions(+), 107 deletions(-) diff --git a/custom_components/gtfs2/__init__.py b/custom_components/gtfs2/__init__.py index 4e71276..fa688c5 100644 --- a/custom_components/gtfs2/__init__.py +++ b/custom_components/gtfs2/__init__.py @@ -8,7 +8,7 @@ from datetime import timedelta from .const import DOMAIN, PLATFORMS, DEFAULT_PATH, DEFAULT_REFRESH_INTERVAL -from .coordinator import GTFSUpdateCoordinator +from .coordinator import GTFSUpdateCoordinator, GTFSRealtimeUpdateCoordinator import voluptuous as vol from 
.gtfs_helper import get_gtfs @@ -20,25 +20,12 @@ async def async_migrate_entry(hass, config_entry: ConfigEntry) -> bool: if config_entry.version == 1: - new_data = {**config_entry.data} - new_data['extract_from'] = 'url' - new_data.pop('refresh_interval') - - new_options = {**config_entry.options} - new_options['real_time'] = False - new_options['refresh_interval'] = 15 - - config_entry.version = 3 - hass.config_entries.async_update_entry(config_entry, data=new) - hass.config_entries.async_update_entry(config_entry, options=new_options) - - if config_entry.version == 2: - - new_options = {**config_entry.options} - new_options['real_time'] = False + new = {**config_entry.data} + new['extract_from'] = 'url' + new.pop('refresh_interval') - config_entry.version = 3 - hass.config_entries.async_update_entry(config_entry, options=new_options) + config_entry.version = 2 + hass.config_entries.async_update_entry(config_entry, data=new) _LOGGER.debug("Migration to version %s successful", config_entry.version) @@ -53,9 +40,6 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool: #await coordinator.async_config_entry_first_refresh() - if not coordinator.last_update_success: - raise ConfigEntryNotReady - hass.data[DOMAIN][entry.entry_id] = { "coordinator": coordinator, } @@ -90,6 +74,6 @@ def update_gtfs(call): async def update_listener(hass: HomeAssistant, entry: ConfigEntry): """Handle options update.""" - hass.data[DOMAIN][entry.entry_id]['coordinator'].update_interval = timedelta(minutes=1) + hass.data[DOMAIN][entry.entry_id]['coordinator'].update_interval = timedelta(minutes=entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)) return True \ No newline at end of file diff --git a/custom_components/gtfs2/config_flow.py b/custom_components/gtfs2/config_flow.py index 1d56101..aeef210 100644 --- a/custom_components/gtfs2/config_flow.py +++ b/custom_components/gtfs2/config_flow.py @@ -34,7 +34,7 @@ class ConfigFlow(config_entries.ConfigFlow, domain=DOMAIN): """Handle a config flow for GTFS.""" - VERSION = 3 + VERSION = 2 def __init__(self) -> None: """Init ConfigFlow.""" @@ -246,6 +246,7 @@ async def async_step_init( ) -> FlowResult: """Manage the options.""" if user_input is not None: + user_input['real_time'] = False if user_input['real_time']: self._user_inputs.update(user_input) _LOGGER.debug(f"GTFS Options with realtime: {self._user_inputs}") @@ -260,7 +261,7 @@ async def async_step_init( data_schema=vol.Schema( { vol.Optional("refresh_interval", default=self.config_entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)): int, - vol.Required("real_time"): vol.In({False: "No", True: "Yes"}), +# vol.Required("real_time"): vol.In({False: "No", True: "Yes"}), } ), ) diff --git a/custom_components/gtfs2/coordinator.py b/custom_components/gtfs2/coordinator.py index a9fbb1c..c04f29a 100644 --- a/custom_components/gtfs2/coordinator.py +++ b/custom_components/gtfs2/coordinator.py @@ -4,11 +4,9 @@ from datetime import timedelta import logging - from homeassistant.config_entries import ConfigEntry from homeassistant.core import HomeAssistant from homeassistant.helpers.update_coordinator import DataUpdateCoordinator -import homeassistant.util.dt as dt_util from .const import DEFAULT_PATH, DEFAULT_REFRESH_INTERVAL from .gtfs_helper import get_gtfs, get_next_departure, check_datasource_index @@ -28,7 +26,7 @@ def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: hass=hass, logger=_LOGGER, name=entry.entry_id, - update_interval=timedelta(minutes=1), + 
update_interval=timedelta(minutes=entry.options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL)), ) self.config_entry = entry self.hass = hass @@ -37,78 +35,87 @@ def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: self._data: dict[str, str] = {} async def _async_update_data(self) -> dict[str, str]: - """Get the latest data from GTFS and GTFS relatime, depending refresh interval""" + """Update.""" data = self.config_entry.data options = self.config_entry.options self._pygtfs = get_gtfs( self.hass, DEFAULT_PATH, data, False ) - previous_data = None if self.data is None else self.data.copy() - _LOGGER.debug("Previous data: %s", previous_data) + self._data = { + "schedule": self._pygtfs, + "origin": data["origin"].split(": ")[0], + "destination": data["destination"].split(": ")[0], + "offset": data["offset"], + "include_tomorrow": data["include_tomorrow"], + "gtfs_dir": DEFAULT_PATH, + "name": data["name"], + } - if previous_data is not None and (previous_data["next_departure"]["gtfs_updated_at"] + timedelta(minutes=options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL))) > dt_util.now().replace(tzinfo=None): - _LOGGER.debug("Do nothing") - self._data = previous_data - - if previous_data is None or (previous_data["next_departure"]["gtfs_updated_at"] + timedelta(minutes=options.get("refresh_interval", DEFAULT_REFRESH_INTERVAL))) < dt_util.now().replace(tzinfo=None): - self._data = { - "schedule": self._pygtfs, - "origin": data["origin"].split(": ")[0], - "destination": data["destination"].split(": ")[0], - "offset": data["offset"], - "include_tomorrow": data["include_tomorrow"], - "gtfs_dir": DEFAULT_PATH, - "name": data["name"], - } + check_index = await self.hass.async_add_executor_job( + check_datasource_index, self._pygtfs + ) + + try: + self._data["next_departure"] = await self.hass.async_add_executor_job( + get_next_departure, self + ) + except Exception as ex: # pylint: disable=broad-except + _LOGGER.error("Error getting gtfs data from generic helper: %s", ex) + _LOGGER.debug("GTFS coordinator data from helper: %s", self._data["next_departure"]) + return self._data - check_index = await self.hass.async_add_executor_job( - check_datasource_index, self._pygtfs - ) - - try: - self._data["next_departure"] = await self.hass.async_add_executor_job( - get_next_departure, self - ) - except Exception as ex: # pylint: disable=broad-except - _LOGGER.error("Error getting gtfs data from generic helper: %s", ex) - return None - _LOGGER.debug("GTFS coordinator data from helper: %s", self._data["next_departure"]) - +class GTFSRealtimeUpdateCoordinator(DataUpdateCoordinator): + """Data update coordinator for the GTFSRT integration.""" + + config_entry: ConfigEntry + + + def __init__(self, hass: HomeAssistant, entry: ConfigEntry) -> None: + """Initialize the coordinator.""" + _LOGGER.debug("GTFS RT: coordinator init") + super().__init__( + hass=hass, + logger=_LOGGER, + name=entry.entry_id, + update_interval=timedelta(minutes=entry.options.get("refresh_interval_rt", DEFAULT_REFRESH_INTERVAL_RT)), + ) + self.config_entry = entry + self.hass = hass + self._data: dict[str, str] = {} + + async def _async_update_data(self) -> dict[str, str]: + """Update.""" + data = self.config_entry.data + options = self.config_entry.options + _LOGGER.debug("GTFS RT: coordinator async_update_data: %s", data) + _LOGGER.debug("GTFS RT: coordinator async_update_data options: %s", options) + #add real_time if setup + if "real_time" in options: - if options["real_time"]: - - """Initialize the info object.""" 
- self._trip_update_url = options["trip_update_url"] - self._vehicle_position_url = options["vehicle_position_url"] - self._route_delimiter = "-" - # if options["CONF_API_KEY"] is not None: - # self._headers = {"Authorization": options["CONF_API_KEY"]} - # elif options["CONF_X_API_KEY"] is not None: - # self._headers = {"x-api-key": options["CONF_X_API_KEY"]} - # else: - # self._headers = None - self._headers = None - self.info = {} - self._route_id = data["route"].split(": ")[0] - self._stop_id = data["origin"].split(": ")[0] - self._direction = data["direction"] - self._relative = False - #_LOGGER.debug("GTFS RT: Realtime data: %s", self._data) - try: - self._get_rt_route_statuses = await self.hass.async_add_executor_job(get_rt_route_statuses, self) - self._get_next_service = await self.hass.async_add_executor_job(get_next_services, self) - except Exception as ex: # pylint: disable=broad-except - _LOGGER.error("Error getting gtfs realtime data: %s", ex) - self._get_next_service = "error" - else: - _LOGGER.info("GTFS RT: RealTime = false, selected in entity options") - self._get_next_service = "n.a." - else: - _LOGGER.debug("GTFS RT: RealTime not selected in entity options") - self._get_next_service = "n.a." - self._data["next_departure"]["next_departure_realtime"] = self._get_next_service - self._data["next_departure"]["gtfs_rt_updated_at"] = dt_util.now().replace(tzinfo=None) - return self._data + """Initialize the info object.""" + self._trip_update_url = options["trip_update_url"] + self._vehicle_position_url = options["vehicle_position_url"] + self._route_delimiter = "-" +# if options["CONF_API_KEY"] is not None: +# self._headers = {"Authorization": options["CONF_API_KEY"]} +# elif options["CONF_X_API_KEY"] is not None: +# self._headers = {"x-api-key": options["CONF_X_API_KEY"]} +# else: +# self._headers = None + self._headers = None + self.info = {} + self._route_id = data["route"].split(": ")[0] + self._stop_id = data["origin"].split(": ")[0] + self._direction = data["direction"] + self._relative = False + #_LOGGER.debug("GTFS RT: Realtime data: %s", self._data) + self._data = await self.hass.async_add_executor_job(get_rt_route_statuses, self) + self._get_next_service = await self.hass.async_add_executor_job(get_next_services, self) + _LOGGER.debug("GTFS RT: Realtime next service: %s", self._get_next_service) + else: + _LOGGER.error("GTFS RT: Issue with entity options") + return "---" + return self._get_next_service diff --git a/custom_components/gtfs2/gtfs_helper.py b/custom_components/gtfs2/gtfs_helper.py index 331ecd1..e2b0a67 100644 --- a/custom_components/gtfs2/gtfs_helper.py +++ b/custom_components/gtfs2/gtfs_helper.py @@ -298,7 +298,6 @@ def get_next_departure(self): "next_departures": timetable_remaining, "next_departures_lines": timetable_remaining_line, "next_departures_headsign": timetable_remaining_headsign, - "gtfs_updated_at": dt_util.now().replace(tzinfo=None), } diff --git a/custom_components/gtfs2/gtfs_rt_helper.py b/custom_components/gtfs2/gtfs_rt_helper.py index 5d9940d..b8b83b0 100644 --- a/custom_components/gtfs2/gtfs_rt_helper.py +++ b/custom_components/gtfs2/gtfs_rt_helper.py @@ -122,7 +122,7 @@ def get_gtfs_feed_entities(url: str, headers, label: str): ## reworked for gtfs2 def get_next_services(self): - self.data = self._get_rt_route_statuses + self.data = self._data self._stop = self._stop_id self._route = self._route_id self._direction = self._direction diff --git a/custom_components/gtfs2/manifest.json b/custom_components/gtfs2/manifest.json index 
8dbe4c3..9c94526 100644 --- a/custom_components/gtfs2/manifest.json +++ b/custom_components/gtfs2/manifest.json @@ -6,6 +6,6 @@ "documentation": "https://github.com/vingerha/gtfs2", "iot_class": "local_polling", "issue_tracker": "https://github.com/vingerha/gtfs2/issues", - "requirements": ["pygtfs==0.1.9","gtfs-realtime-bindings==1.0.0"], + "requirements": ["pygtfs==0.1.9"], "version": "0.1.5" } diff --git a/custom_components/gtfs2/sensor.py b/custom_components/gtfs2/sensor.py index 102120c..30800fd 100644 --- a/custom_components/gtfs2/sensor.py +++ b/custom_components/gtfs2/sensor.py @@ -12,6 +12,8 @@ from homeassistant.util import slugify import homeassistant.util.dt as dt_util +from .coordinator import GTFSRealtimeUpdateCoordinator + from .const import ( ATTR_ARRIVAL, ATTR_BICYCLE, @@ -379,18 +381,8 @@ def _update_attrs(self): # noqa: C901 PLR0911 self._attributes["next_departures_headsign"] = self._departure[ "next_departures_headsign" ][:10] - - # Add next departure realtime - self._attributes["next_departure_realtime"] = self._departure[ - "next_departure_realtime" - ] - self._attributes["gtfs_rt_updated_at"] = self._departure[ - "gtfs_rt_updated_at" - ] - - self._attributes["gtfs_updated_at"] = self._departure[ - "gtfs_updated_at" - ] + + self._attributes["updated_at"] = dt_util.now().replace(tzinfo=None) self._attr_extra_state_attributes = self._attributes return self._attr_extra_state_attributes @@ -420,3 +412,36 @@ def remove_keys(self, prefix: str) -> None: } +class GTFSRealtimeDepartureSensor(CoordinatorEntity): + """Implementation of a GTFS departure sensor.""" + + def __init__(self, coordinator: GTFSRealtimeUpdateCoordinator) -> None: + """Initialize the GTFSsensor.""" + super().__init__(coordinator) + self._name = coordinator.data["name"] + "_rt" + self._attributes: dict[str, Any] = {} + + self._attr_unique_id = f"gtfs-{self._name}_rt" + self._attr_device_info = DeviceInfo( + name=f"GTFS - {self._name}", + entry_type=DeviceEntryType.SERVICE, + identifiers={(DOMAIN, f"GTFS - {self._name}_rt")}, + manufacturer="GTFS", + model=self._name, + ) + _LOGGER.debug("GTFS RT Sensor: coordinator data: %s", coordinator.data ) + self._coordinator = coordinator + self._attributes = self._update_attrs_rt() + self._attr_extra_state_attributes = self._attributes + + @callback + def _handle_coordinator_update(self) -> None: + """Handle updated data from the coordinator.""" + self._update_attrs_rt() + super()._handle_coordinator_update() + + def _update_attrs_rt(self): # noqa: C901 PLR0911 + _LOGGER.debug(f"GTFS RT Sensor update attr DATA: {self._coordinator.data}") + self._attr_native_value = coordinator.data + self._attributes["next_departure_realtime"] = self._coordinator.data + return self._attributes \ No newline at end of file From 36e3b790c9fd6e5f7878edb1d4b1763309ef5b9e Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Sat, 11 Nov 2023 07:32:20 +0100 Subject: [PATCH 14/18] Add library for realtime --- custom_components/gtfs2/manifest.json | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/custom_components/gtfs2/manifest.json b/custom_components/gtfs2/manifest.json index 9c94526..8dbe4c3 100644 --- a/custom_components/gtfs2/manifest.json +++ b/custom_components/gtfs2/manifest.json @@ -6,6 +6,6 @@ "documentation": "https://github.com/vingerha/gtfs2", "iot_class": "local_polling", "issue_tracker": "https://github.com/vingerha/gtfs2/issues", - "requirements": ["pygtfs==0.1.9"], + "requirements": 
["pygtfs==0.1.9","gtfs-realtime-bindings==1.0.0"], "version": "0.1.5" } From 57dc03cd4043bedef284529a2c8789b32fe012dd Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Sun, 12 Nov 2023 18:31:40 +0100 Subject: [PATCH 15/18] Update README.md --- README.md | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index ff8c4a9..a9aa260 100644 --- a/README.md +++ b/README.md @@ -25,10 +25,11 @@ Core GTFS uses start + stop, it then determines every option between them and pr 20231104: initial version -## ToDo's +## ToDo's / In Development - Issue when updating the source db, it throws a db locked error. This when an existing entity for the same db starts polling it at the same time -- Icon for the integration (brands) +- (DONE) Icon for the integration (brands) - bypass setup control for routes that have no trips 'today'. The configuration does a spot-check if start/end actually return data with the idea to validate the setup. However, this only checks for 'today' so if your route actually has no transport running at the day of setup (say Sunday or Holiday) then it will reject it. +- (in DEV release) adding real-time data for providers that offer these for the same gtfs data: initally time and lat/long ## Installation via HACS : From d2cc4a8d983564ac01dffb625386fcca37ec0116 Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Sun, 12 Nov 2023 18:49:26 +0100 Subject: [PATCH 16/18] Update README.md --- README.md | 1 + 1 file changed, 1 insertion(+) diff --git a/README.md b/README.md index a9aa260..ff7f927 100644 --- a/README.md +++ b/README.md @@ -68,6 +68,7 @@ or via yaml ## Thank you - @joostlek ... massive thanks to help me through many (!) tech aspects and getting this to the inital version - @mxbssn for initiating, bringing ideas, helping with testing +- @mark1foley for his gtfs real time integration which I managed to alter/integrate From d667be2b95eb3d2bfe38617e36c7d176ac91742b Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Mon, 13 Nov 2023 12:33:27 +0100 Subject: [PATCH 17/18] Update README.md --- README.md | 14 ++++++++++++-- 1 file changed, 12 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index ff7f927..1425001 100644 --- a/README.md +++ b/README.md @@ -62,8 +62,18 @@ or via yaml ![image](https://github.com/vingerha/gtfs2/assets/44190435/2fea7926-a64d-43b6-a653-c95f1f01c66d) - - +## Known issues/challenges with source data + +Static gtfs: +- not complying to the pygtfs unpacking library, examples: missing dates in feed_info > manual fix +- calendar not showing if a service is run on a specific day > fix via adding calendar_dates to filter, only works if (!) 
calendar_dates is used alternatively for the same purpose +- missing routes/stops/times, transport runs but gtfs does not show it > report issue with your gtfs data provider +- routes show A > B (outward) but stop selection shows the reversed B > A, within one gtfs source both correct and incorrect start/end can show up > report issue with your gtfs data provider + +Realtime gtfs +- a few realtime providers also add vehicle positions with lat/lon, these are not always up to date > report issue with your gtfs data provider +- incorrect format of incoming json/feed > report issue with your gtfs data provider, they should adhere to standards +- realtime data not always available, a few refreshes are fine then nothing then fine again, often related to a timeout from the provider > report issue with your gtfs data provider ## Thank you - @joostlek ... massive thanks to help me through many (!) tech aspects and getting this to the inital version - @mxbssn for initiating, bringing ideas, helping with testing +- @mark1foley for his gtfs real time integration which I managed to alter/integrate From dc0c9afcaa9a5caeda859d1dd267f1d00439a198 Mon Sep 17 00:00:00 2001 From: Arjan <44190435+vingerha@users.noreply.github.com> Date: Sat, 25 Nov 2023 10:46:54 +0100 Subject: [PATCH 18/18] Update hacs.json --- hacs.json | 1 + 1 file changed, 1 insertion(+) diff --git a/hacs.json b/hacs.json index b07636f..fb6c2e8 100644 --- a/hacs.json +++ b/hacs.json @@ -1,4 +1,5 @@ { "name": "GTFS2 for HomeAssistant", + "render_readme": true, "homeassistant": "2023.10.1" }
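
For readers following the coordinator changes in PATCH 11 (rolled back again in PATCH 13): the idea there is that the coordinator polls every minute, but the static GTFS database is only re-queried once the configured refresh_interval has elapsed since the cached gtfs_updated_at timestamp, while the realtime feed is fetched on every run and stamped separately as gtfs_rt_updated_at. The sketch below illustrates only that gating decision; it is not the integration's actual code, it swaps Home Assistant's DataUpdateCoordinator and dt_util for plain datetime objects, and the helper name should_refresh is purely illustrative.

```python
# Minimal sketch of the refresh-interval gating from PATCH 11 (assumption:
# plain datetime stands in for Home Assistant's dt_util; the coordinator and
# config-entry machinery are omitted).
from datetime import datetime, timedelta

DEFAULT_REFRESH_INTERVAL = 15  # minutes, mirroring the option default used in the diffs


def should_refresh(previous_data: dict | None,
                   refresh_interval: int = DEFAULT_REFRESH_INTERVAL) -> bool:
    """Return True when the cached static GTFS data is stale and must be rebuilt."""
    if previous_data is None:
        # First run: nothing cached yet, always query the database.
        return True
    updated_at = previous_data["next_departure"]["gtfs_updated_at"]
    return updated_at + timedelta(minutes=refresh_interval) < datetime.now()


if __name__ == "__main__":
    # Cached data from 5 minutes ago with a 15-minute interval: keep the cache.
    cached = {"next_departure": {"gtfs_updated_at": datetime.now() - timedelta(minutes=5)}}
    print(should_refresh(cached))  # False -> reuse previous data, skip the db query
    print(should_refresh(None))    # True  -> first update, query the db
```

In the PATCH 11 version of _async_update_data the same comparison is written inline against previous_data, and the expensive get_next_departure call only runs on the True branch, which is what keeps the one-minute coordinator cycle cheap between real refreshes.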