I just started exploring BlenderBIM, and I am investigating its potential use to store a rather complex reconstruction model that was developed for lighting simulation. I face three challenges right now:
1. I need to instance objects that occur multiple times to limit memory demand. If I understand correctly, the aggregations discussed above do this?
2. We have textual information and drawings that give background information on objects - this should work with annotations.
3. There is a lot of information on reflection and transmission properties that defines the geometry's appearance in lighting simulation (with Radiance).
In particular for 3., I would be curious how far the implementation of external material definitions has come by now. In the announcement, the possibility to store e.g. .mat files for Radiance is mentioned. Is this data embedded in the IFC, and if so, when and where is it written out? Or is it just a link to external files that are included as requirements, but not resolved if I were to e.g. export the geometry from an IFC (typically as Wavefront OBJ for import into Radiance)?
@lgrobe g'day and welcome to the OSArch community! A little round of introduction for those reading who haven't seen @lgrobe before - he comes from the Radiance community and is an amazing lighting simulationist to be reckoned with!
I had previously used the BlenderBIM Add-on to recreate the living room example (the picture on the BlenderBIM Add-on homepage, which you might remember). There are also a number of private projects I ran it on, which were never shared. So I believe it has potential. However, a lot has moved since then, and the scripts and methodology I used need to be heavily revised. They handled complex material definitions (e.g. glazing), textures, and lots of instancing. So everything you're asking for is available, but the add-on in its current form (it just underwent a complete rewrite, so is slowly regaining features) does not yet have all the buttons and functions for such a workflow. That said, if you are keen, I would love to collaborate to make it a reality.
Building support for this would require first building the UI to edit the lighting properties and link to external files. Then, some form of automated OBJ conversion plus a generator for Radiance instancing definitions would be good. Finally, automatic IFC->mat writing would be great too. Thoughts?
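To make the IFC->mat direction concrete, here is a minimal sketch in Python, assuming the reflectance values have already been extracted from IFC property sets into a plain dict (the material name, property values, and dict shape below are all invented for illustration; a real exporter would pull them out via IfcOpenShell):

```python
def to_radiance_plastic(name, red, green, blue, specularity=0.0, roughness=0.0):
    """Emit a Radiance 'plastic' primitive for an opaque surface.

    Format: modifier, type, identifier, then string/integer/real
    argument counts and values (5 reals: R G B spec rough).
    """
    return (
        f"void plastic {name}\n"
        "0\n"
        "0\n"
        f"5 {red:.3f} {green:.3f} {blue:.3f} {specularity:.3f} {roughness:.3f}\n"
    )

# Hypothetical reflectances, as they might come out of an IFC property set:
materials = {"wall_paint": (0.70, 0.68, 0.65)}

mat_file = "\n".join(to_radiance_plastic(n, *rgb) for n, rgb in materials.items())
print(mat_file)
```

The same pattern would extend to other Radiance primitive types (glass, trans, BSDF) once the relevant properties are defined in the IFC.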
Thank you for the warm welcome and the reply to my beginner's questions! Maybe I should share the background of these questions, which may be a bit atypical as a BIM application:
We have a very detailed lighting simulation model that evolved from 20+ years of research. To stay within the little memory available at the time (a Pentium III with 128 MB RAM was considered a workstation back then), we modelled building parts as separate files that were instantiated (in CAD or, in Radiance, as pre-compiled octree instances) by a rather complex Makefile traversing the model's directory structure. Besides 3D geometry, separate material descriptions (for Radiance) and documentation (2D construction lines, texts, photographs) were kept with the objects.

While this was innovative two decades ago, I would like to consolidate the model into something that can be parsed by CAD software, and output not only to lighting simulation. I had first thought of Collada, since I do not have the typical BIM information, but attaching non-3D information is not well supported in a 3D modelling format. X3D would be another option, but again 2D support is rudimentary. So I came up with the idea to use the Python environment in Blender to collect all the model elements, add them to a BIM model, and end up with an IFC that holds all 3D model data, simulation-related material data, and the 2D or textual documentation. Unfortunately, this has the status of a spare-time project right now, since the model results from prior research projects. Nevertheless I would like to make some progress here, and we might include some colleagues who could contribute resources if this promises general applicability.
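As an aside, the octree-instance part of that old workflow is straightforward to reproduce programmatically. A sketch, with an invented octree file name and placements (Radiance's `instance` primitive takes the octree file plus `xform` arguments as its string arguments):

```python
def radiance_instances(octree, placements):
    """Emit one Radiance 'instance' primitive per placement of a shared octree.

    placements: list of (dx, dy, dz, rot_z_degrees) tuples -- illustrative
    only; a real exporter would derive these from IFC object placements.
    """
    chunks = []
    for i, (dx, dy, dz, rz) in enumerate(placements):
        # xform-style arguments: rotate about Z, then translate.
        args = f"{octree} -rz {rz:g} -t {dx:g} {dy:g} {dz:g}"
        nargs = len(args.split())
        chunks.append(f"void instance part_{i}\n{nargs} {args}\n0\n0\n")
    return "\n".join(chunks)

print(radiance_instances("column.oct", [(0, 0, 0, 0), (3.6, 0, 0, 90)]))
```

The geometry of the shared part lives once in the octree; each placement costs only a few lines of scene description, which is exactly the memory-saving behaviour described above.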
Following up to your answers:
1. OK, obviously I got confused here, so it is mapped representations that I should remember. I guess that creating them in the import process described above should be no big problem. Are BlenderBIM's mapped representations represented by instances in Blender? If I understand it correctly, that should allow any Blender exporter to account for the shared data without BlenderBIM-specific adaptations (e.g. Collada, X3D, or vi-suite).
2. The annotations, as mentioned above, should collect the underlying assumptions that led to the 3D model in my case. This is different from many typical BIM use cases. It would be great to have wireframe drawings transformed along with the 3D model data they relate to (e.g. the construction lines of an arc that is modelled in 3D), but even linking to external resources would be a feasible approach (and is needed e.g. for photographs, text sources etc.). Not sure if this is stretching the concept of annotations a bit too far, though.
3. External material maps would definitely be an extremely powerful means to keep information together. If this were functional and stable, one might aim at a common interface in other simulation tools (coming from the Radiance side, having vi-suite use such linked data would be an obvious approach). Is there a standard way for an exporter to recognize an external map relation? Can binary resources be linked too? In Radiance, I might have image maps, XMLs containing BSDF data, etc. Maybe this is a good starting point for me to look at the code, since I know what kind of data I need to encode.
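On recognising external references: in IFC, such links are typically stored as location strings (as in IfcExternalReference-style entities) rather than embedded data, so an exporter mainly needs to resolve them relative to the IFC file and classify them. A stdlib-only sketch of that idea, with invented file names and a naive suffix-based classification:

```python
from pathlib import PurePosixPath

# Hypothetical table of external material resources, as an exporter might
# collect them from location strings stored in the IFC (relative paths,
# resolved against the IFC file's own directory):
EXTERNAL_REFS = {
    "glazing_bsdf": "materials/glazing.xml",  # XML BSDF data (binary-ish payload)
    "wall_finish": "materials/interior.mat",  # Radiance material file
}

def resolve(ifc_dir, ref_name):
    """Resolve an external reference relative to the IFC file's directory."""
    location = EXTERNAL_REFS[ref_name]
    path = PurePosixPath(ifc_dir) / location
    # Naive classification by suffix; a real implementation would use a
    # declared type on the reference rather than guessing.
    kind = "bsdf" if path.suffix == ".xml" else "radiance-material"
    return str(path), kind

path, kind = resolve("/models/project1", "glazing_bsdf")
```

Since only paths are stored, binary resources (image maps, BSDF XMLs) link just as easily as text files; the open question is whether the IFC ships alongside them or bundles them.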
@lgrobe ahh, so you want to take existing CAD geometry and metadata, then combine them into an IFC? I was under the impression of the opposite - using IFCs to then attach extra lighting specific metadata and feed into Radiance simulations.
Do you have a simple case we can work on together to build a first minimal working example?