BlenderBIM: Lighting simulation

This discussion was created from comments split from: Random BlenderBIM questions / troubleshooting.


  • Dear all,
    I just started exploring BlenderBIM, and I am investigating its potential use to store a rather complex reconstruction model that was developed for lighting simulation. I face three challenges right now:
    1. I need to instance objects that occur multiple times to limit memory demand. To my understanding, the aggregations discussed above do this?
    2. We have textual information and drawings that give background information on objects - this should work with annotations.
    3. There is a lot of information on reflection and transmission properties that defines the geometries' appearance in the lighting simulation (with Radiance).
    In particular for 3., I would be curious how far the implementation of external material definitions has come. In the announcement, the possibility to store e.g. .mat files for Radiance is mentioned. Is this data embedded in the IFC, and if so, when and where is it written out? Or is it just a link to external files that are included as requirements, but not carried along if I were to, e.g., export the geometry from an IFC (typically as Wavefront OBJ for import into Radiance)?
    Best, Lars.

  • edited February 22

    @lgrobe g'day and welcome to the OSArch community! A little round of introduction for those reading who haven't seen @lgrobe before - he comes from the Radiance community and is an amazing lighting simulationist to be reckoned with!

    I had previously used the BlenderBIM Add-on to recreate the living room example (the picture on the BlenderBIM Add-on homepage, which you might remember). There are also a number of private projects I ran it on, which were never shared. So I believe it has potential. However, a lot of things have moved since then, and the scripts and methodology I used need to be heavily revised. They handled complex material definitions (e.g. glazing), textures, and lots of instancing. So everything you're asking for is available, but the add-on in its current form (it just underwent a complete rewrite, so is slowly regaining features) does not have all the buttons and functions for a workflow. That said, if you are keen, I would love to collaborate to make it a reality.

    Your questions:

    1. Aggregations are a different concept - basically allowing the composition of objects from subobjects. Instead, the term you're after is mapped representations. These come for free with most BIM models from other authoring apps. If an object is part of a typical construction type (e.g. a chair), each chair would be an instance of that chair type. This translates well to xform statements that place it around a scene. I would need to build a translator to create these statements.
    2. Annotations can indeed embed text and arbitrary polygons in a scene, but I might need a bit more clarification on what you need before I can say how it might work.
    3. Alas, this is one of the regressions that happened during the rewrite, so you won't find it in the BlenderBIM Add-on UI until I rebuild it. IFC can store simple lighting properties, with optionally a couple more properties on a material. Notably missing is roughness, I believe. For more complex definitions, we can forego the IFC definition and simply link to an arbitrary file, like the .mat files. Note the link includes both a URI and an identifier, so that you can have a many-to-one relationship between materials and mat files, which is good for complex scenes. Conversion to OBJ is still needed, as that is what Radiance reads.
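The xform translator mentioned in point 1 could start as small as the following sketch. This is an assumption-laden illustration, not existing add-on code: the helper name is made up, and real IFC placements are full 4x4 matrices that would need to be decomposed into -rx/-ry/-rz/-t arguments rather than the simple translation-plus-Z-rotation shown here.

```python
def xform_statement(rad_file, translation, rz_degrees=0.0):
    """Build a Radiance scene line that instances rad_file at a placement.

    Hypothetical helper: a complete translator would decompose a full
    IFC object placement matrix, not just a translation and a Z rotation.
    """
    tx, ty, tz = translation
    parts = ["!xform"]
    if rz_degrees:
        parts.append(f"-rz {rz_degrees:g}")   # rotate about Z first...
    parts.append(f"-t {tx:g} {ty:g} {tz:g}")  # ...then translate into place
    parts.append(rad_file)
    return " ".join(parts)

# One shared type (chair.rad), many placements - one line per mapped instance:
print(xform_statement("chair.rad", (1.0, 2.0, 0.0), 90))
# → !xform -rz 90 -t 1 2 0 chair.rad
```

Radiance would then load chair.rad once and reuse it for every such line, which is exactly the memory behaviour mapped representations promise on the IFC side.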

    Building support for this would require first building the UI to edit the lighting properties and link to external files. Then, some form of automated OBJ conversion plus a Radiance instancing definitions generator would be good. Finally, automatic IFC->mat writing would be great too. Thoughts?
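The IFC->mat step could be sketched as below. The mapping from IFC surface-style attributes to Radiance's five `plastic` values is an assumption (IFC has no roughness field, as noted above, so that value would have to come from a custom property set or stay in the linked .mat file); only the `void plastic` primitive syntax itself is standard Radiance.

```python
def radiance_plastic(name, rgb, specularity=0.0, roughness=0.0):
    """Emit a Radiance 'plastic' primitive for an opaque diffuse surface.

    Which IFC attributes feed r/g/b/specularity/roughness is assumed here,
    not defined by the schema.
    """
    r, g, b = rgb
    return (
        f"void plastic {name}\n"
        "0\n"
        "0\n"
        f"5 {r:g} {g:g} {b:g} {specularity:g} {roughness:g}\n"
    )

# A generator run over all styled materials could write one shared .mat file:
print(radiance_plastic("white_paint", (0.8, 0.8, 0.8), 0.02, 0.05))
```

Transparent materials (glass, trans) and BSDFs would need their own emitters, which is where falling back to a linked external file becomes attractive.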

  • Dear Moult,
    thank you for the warm welcome and the reply to my beginner's questions! Maybe I should share the background of these questions, which may be a bit atypical for a BIM application:
    We have a very detailed lighting simulation model that evolved from 20+ years of research. To manage memory with the little resources available at the time (a Pentium III with 128 MB RAM was considered a workstation back then), we modelled building parts as separate files that were instantiated (in CAD or, in Radiance, as pre-compiled octree instances) by a rather complex Makefile traversing the model's directory structure. Besides 3D geometry, there were separate material descriptions (for Radiance) and documentation (2D construction lines, texts, photographs) kept with the objects. While this was innovative two decades ago, I would like to consolidate the model into something that can be parsed by CAD software and output to more than just lighting simulation. I had first thought of Collada, since I do not have the typical BIM information, but attaching non-3D information is not well supported in a 3D modelling format. X3D would be another option, but again 2D support is rudimentary. So I came up with the idea of using the Python environment in Blender to collect all the model elements, add them to a BIM model, and end up with an IFC that holds all 3D model data, simulation-related material data, and the 2D or textual documentation. Unfortunately, this has the status of a spare-time project right now, since the model results from prior research projects. Nevertheless, I would like to make some progress here, and we might include some colleagues who could contribute resources if this promises general applicability.
    Following up to your answers:
    1. OK, obviously I got confused here, so it is mapped representations that I should remember. I guess that creating them in the import process described above should be no big problem. Are BlenderBIM's mapped representations represented by instances in Blender? If I understand correctly, that should allow any Blender exporter to account for the shared data without BlenderBIM-specific adaptations (e.g. Collada, X3D, or vi-suite).
    2. The annotations, as mentioned above, should collect the underlying assumptions that led to the 3D model in my case. This is different from many typical BIM use cases. It would be great to have wireframe drawings transformed along with the 3D model data they relate to (e.g. the construction lines of an arc that is modelled in 3D), but even linking to external resources would be a feasible approach (and needed e.g. for photographs, text sources, etc.). Not sure if this is stretching the concept of annotations a bit too far, though.
    3. External material maps would definitely be an extremely powerful means of keeping information together. If this were functional and stable, one might aim at a common interface in other simulation tools (coming from the Radiance applications, having vi-suite use such linked data would be an obvious approach). Is there a standard way for an exporter to recognize an external map relation? Can binary resources be linked, too? In Radiance, I might have image maps, XMLs containing BSDF data, etc. Maybe this is a good starting point for me to look at the code, since I know what kind of data I need to encode.
    Best, Lars.

    @lgrobe ahh, so you want to take existing CAD geometry and metadata, then combine them into an IFC? I was under the opposite impression - that you wanted to use IFCs to attach extra lighting-specific metadata and feed them into Radiance simulations.

    1. A note that the BlenderBIM Add-on is not an importer/exporter. It does not function by creating a Blender scene in a particular way and then pressing an export button. Instead, it directly manipulates IFC data in memory, based on functions you run within Blender. I guess what I'm saying is that linked-data instances in Blender do represent IFC mapped representations, but merely linking them in Blender does not magically map them in IFC. There is also an additional requirement that mapped representations belong to the same construction type. Don't worry - it's a convoluted way of explaining the details, but fundamentally IFC mapped representations are very similar to Blender's linked data.
    2. IFC can indeed store related arbitrary external documents, like photographs and text sources. This may be what you're after.
    3. Externally defined surface styles in IFC use a URI - they do not embed data in the IFC schema itself - so the nature of the file, whether binary or otherwise, doesn't matter. In theory, there is a way to store image blobs inside the IFC schema, but I do not know of anybody who has implemented it. Maps which rely on UV coordinates may be tricky, though. It is possible to store UV maps much like the OBJ + .mat combo workflow in Radiance, but I have not built this functionality yet, so there is a bit of work.
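The URI-plus-identifier pair described in point 3 (and the many-to-one relationship mentioned earlier) can be sketched with plain data structures. The class and function names below are hypothetical stand-ins, not BlenderBIM API; they only model how an exporter might collect the external links and work out which files it has to resolve alongside the geometry:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ExternalStyleLink:
    """Stand-in for an externally defined surface style reference:
    a URI to the resource plus an identifier within it, so several
    materials can point into the same file."""
    location: str        # URI, e.g. "office.mat" - binary files work too
    identification: str  # name inside the resource, e.g. a Radiance modifier

def files_to_resolve(links):
    """Group links by file: the set of external resources an exporter
    (OBJ, vi-suite, ...) would need to carry alongside the geometry."""
    grouped = {}
    for material, link in links.items():
        grouped.setdefault(link.location, []).append(
            (material, link.identification))
    return grouped

links = {
    "Wall-Paint": ExternalStyleLink("office.mat", "white_paint"),
    "Ceiling":    ExternalStyleLink("office.mat", "matte_white"),
    "Glazing":    ExternalStyleLink("glass.xml", "bsdf_front"),
}
```

Here two materials share office.mat (the many-to-one case), while the BSDF XML shows that a non-text resource is just another URI.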

    Do you have a simple case we can work on together to build a first minimal working example?
