Hi, your work is very interesting! I love Blender, and even more combined with sensors.
Is it possible to link or integrate your project as a plugin for Home Assistant (https://www.home-assistant.io/)?
Thanks for your time
Thank you @enrutador for your suggestion.
I did indeed have the idea in mind to interact with higher-level platforms.
I started the proof of concept with the most basic light using the manufacturer's native REST API, then a generic MQTT integration.
To scale the project up, I'm realizing it is beneficial to link it to a framework that wraps multiple devices in a single environment. That way, this project can focus on the 3D interaction and does not reinvent device interfacing, which is already well done by others.
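As a rough sketch of what the generic MQTT path looks like, here is a minimal Python helper using the paho-mqtt client. The topic name and JSON payload shape are hypothetical, not the project's actual convention:

```python
import json


def light_payload(on: bool, brightness: int = 255) -> str:
    """Build a hypothetical JSON payload for a generic MQTT light."""
    return json.dumps({"state": "ON" if on else "OFF", "brightness": brightness})


def publish_light(client, topic: str, on: bool, brightness: int = 255) -> None:
    """Publish the desired state; `client` is e.g. a paho.mqtt.client.Client."""
    client.publish(topic, light_payload(on, brightness), qos=1, retain=True)


# Usage (requires a running broker, e.g. Mosquitto on localhost):
# import paho.mqtt.client as mqtt
# client = mqtt.Client()
# client.connect("localhost", 1883)
# publish_light(client, "home/livingroom/light/set", on=True, brightness=128)
```

The point of a framework like Home Assistant is that it hides exactly this per-device payload logic behind a uniform entity model.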
As I'm not much into Java and prefer Python, Home Assistant is indeed one of my favorite choices.
I haven't explored the Home Assistant community yet, but if you know it well, that could be a starting point; any advice or recommendation is welcome.
**My thoughts regarding interactions with a home automation framework**
3D interface efficiency can always be criticized and compared to a 2D interface from a usability perspective. I still see some major principles that have yet to be applied for a 3D interface to gain an advantage over a 2D one:

- Resolving occlusion of interactive items, both by design in their placement and by restricting the camera view, at least the initial one.
- Highlighting interactive elements (already done with effects).
- Navigation between zones: bring the user to the optimal view of a room with a single click, without making them twist and translate the camera manually.
- Hiding irrelevant elements, e.g. when the user is not interested in lights.
I started this project with the idea of a completely independent web app, which was hit hard as soon as I struggled with the CoAP issues and the WebSocket support that Mosquitto does not enable by default. So splitting it into front end and back end is something I'd keep as a variation, but it is necessary to extend the use cases. One way of resolving this could be to have no back end of our own and instead rely on Home Assistant, for example.
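For reference, Mosquitto can serve browser clients directly over WebSockets by adding a second listener in `mosquitto.conf`; this is a minimal sketch with illustrative port numbers:

```
# Default MQTT listener for native clients
listener 1883

# Additional WebSocket listener for browser clients
listener 9001
protocol websockets
```

This only removes the transport obstacle; it does not solve CoAP devices, which is where a framework wrapping device protocols helps.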
I will definitely try to attract the attention of the Home Assistant community and let them decide on an interesting way to connect it. As this project is MIT-licensed with no weird license restrictions, I would gladly give all the required support if someone needs help. Thinking about it, I see two different ways: a 3D panel inside Home Assistant among the other panels, or an independent full-screen URL.
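As a rough sketch of the first option, Home Assistant can embed an external web app as a sidebar panel via `panel_iframe` in `configuration.yaml`; the URL here is a placeholder:

```yaml
# configuration.yaml (URL is a placeholder)
panel_iframe:
  home3d:
    title: "3D Home"
    icon: mdi:home
    url: "https://example.com/home3d"
```

The independent-URL option would simply be the same app opened full screen, talking to Home Assistant's API directly.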
I also think this is just the beginning of 3D interfaces, and the CPU/GPU power of mobile devices still needs a number of years before becoming convenient for most users. There is no limit to the realism we could add to the 3D app, from simple shading to real-time ray tracing with correct sunlight global illumination, and then inclusion in a VR platform; it's an exciting future for home automation. That said, some critics say the home should be smart without any user control, while others prefer to keep the last configuration step in their own hands.