Autodesk Wants to Kill the File

edited February 2021 in General

This is an interesting read
https://www.upfrontezine.com/2021/02/upf-1083.html

"Autodesk CEO Andrew Anagnost is adamant. “I think there is something we need to acknowledge right now: that a file is a dead thing working,” he said in an interview last fall with Architosh magazine."

If people want to share their thoughts, I will try to send something to Grabowski; he's pretty good at publishing reader responses and has previously published my comments on 'open washing'. So please, in concise form: what do people need to know about Autodesk's stated direction?

I suggest that we try something. After I write a reply collecting people's thoughts, I will tag everyone who contributed and wait 24 hours for anyone who wants to stop the process. If no one vetoes it, I would like to send it in my name but on behalf of OSArch, linked to this discussion. How does that sound?

Comments

  • Sounds good.

    I can somewhat understand Autodesk's problem: a huge number of applications with different file formats (Revit, AutoCAD, Inventor, 3ds Max and so on), and customers who rightly expect all these applications to be interoperable. Who else should provide this if not the single vendor? That is basically their USP: use all our apps and there's no problem with interoperability. But Autodesk fails this basic expectation.

    "Passing files through translators is to be replaced by code using APIs" - In my view this is a misconception. APIs need data translation as well. That's why there is a parser and a verification step behind every well-written API. It doesn't matter whether the data comes from a data stream or from a file; the important part is that the incoming data is openly and predictably readable. That's why open data specs are more important than APIs.
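
    To illustrate the point: whether data arrives over an API or is read from a file, the receiving code still has to parse and validate it. A minimal Python sketch (the entity fields and function name are made up for illustration):

```python
import json

# Hypothetical receiving end of an API exchange. The wire format (JSON
# here) still has to be parsed and validated - exactly as a file read
# from disk would be.
REQUIRED_FIELDS = {"GlobalId", "Name"}

def receive_wall(payload: str) -> dict:
    data = json.loads(payload)           # translation: bytes -> objects
    missing = REQUIRED_FIELDS - data.keys()
    if missing:                          # verification against the spec
        raise ValueError(f"missing fields: {sorted(missing)}")
    return data

# The same code validates data whether it came from a network stream...
wall = receive_wall('{"GlobalId": "wall-guid-001", "Name": "Wall-001"}')

# ...or from a file:
# with open("wall.json") as f:
#     wall = receive_wall(f.read())
```

    The translation and verification work is identical either way; only the transport differs.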

    My first thought is that this is an attempt to lock customers more tightly into BIM360 or whatever cloud services. If there is no file, you cannot take your data away from the vendor. You are at the vendor's whim as to whether they provide open API access to export your data. And what happens if the vendor restricts access to the API for some reason? I agree with the closing sentence of Grabowski: "I am optimistic that the prediction of the death of the file system is wildly wrong, for one simple reason: file systems allow customers to retain power over vendors."

  • edited February 2021

    Moving files to the cloud does not mean there is no place to store data.
    The point about interoperability is that most adsk apps rely on a Microsoft memory snapshot as the way to store files, so each app version / compiler (dev studio) generates a different file format that is not compatible - or is made upward-compatible by hand - and is closely tied to the underlying application. This means anti-interoperability by definition.
    Vendor brainwashing bullshit over the top!
    On the other hand, a common data model definition such as BIM has proven hard to implement correctly without any market-share incentive to do so.
    Last but not least, adsk's attempt to create a common file format for its own applications (FBX) led to such a mess that even adsk is no longer able to read those files.
    So definitely yes, adsk has proved that reliably writing and reading files is something it is not able to do.

  • edited February 2021

    From what I understand, Speckle is doing basically the same thing, only open. I really can't wait for the presentation this weekend, since I haven't found the time to read how they approach this. (And I am personally rather sceptical about this approach.)
    I'm really curious how Autodesk, especially with its reputation as a ruthless, profit-oriented bully, is going to try to convince customers that having permanent access to neither the software nor the data is beneficial to them.
    (Even though this could easily have been expected, and the next step might be cloud computing: the client doesn't need anything but a big display, a lot of money and blind trust in Autodesk.)

  • "Kill the file" is another way of saying "lock into subscription-based cloud services", and I believe that is their intent. It is highly beneficial to Autodesk and highly harmful to the industry - which includes me and you. We need to help educate the public that this direction Autodesk is taking is not something they should support.

    From a technical perspective, "file-less" is incorrect. At the end of the day, computers work with files. Perhaps a better way of explaining it is that the traditional file-sharing paradigm is dead - i.e. the paradigm where you save a local file, share it on a network, and someone downloads it and operates on the local file via their local machine. There is nothing wrong with moving beyond that paradigm in principle; in fact, I believe it is generally positive. For example:

    1. Converting data syncing from a Dropbox-style approach into standardised APIs streamlines the data-sharing process, which benefits users through more seamless collaboration workflows. Speckle is one approach to this.
    2. Doing operations / computation on cloud hardware instead of local hardware allows faster analysis of complex problems, which benefits users because they get their analysis results sooner. Pollination Cloud is one approach to this.
    3. Storing data in a database-like or queryable format (remember - this is still files from a purely technical perspective!) lets users worry less about "how do we segregate data for easy transport / sharing"; they can instead treat the entire built environment as a growing database in which they work on a subset of the world. This benefits users because their designs can more easily be treated holistically instead of "within the red site boundary". QGIS Server and OSM is one approach to this.
    4. Storing data on cloud servers allows easy access to the source of truth for those unfamiliar with a project or not working in the trenches on a job. It makes web interfaces, dashboards and online viewers seamless for people who don't want to install local software just to check things out, and even enables full web-based authoring programs, useful in countries where local hardware cannot be afforded. All of this is generally beneficial to users.

    So, "more cloud, fewer local files" certainly does bring many benefits! These are the benefits we should support. However, sneaky marketing may encourage myths that we should catch and debunk. These myths slowly blur the line between the benefits and the negatives - which are essentially vendor lock-in, loss of data sovereignty, blind faith in the cloud, and perpetual subscription.

    Myths include:

    1. "IFC is dead. APIs are the future!" False. As @doia mentions, APIs need translation, and IFC provides this. Note further that IFC is not just a file format - it can be, and already is, treated as a database.
    2. "Files are dead. APIs are the future!" False. Files and APIs are complementary. Forcing the latter hurts users.
    3. "Files are dead. Databases are the future!" False. Databases are files, and the two paradigms are complementary.
    4. "Cloud analysis and AI is the future, check it out!" Sort of - yes, scalable computing is good, but black boxes are bad. Watch out for cloud services that promote black-box analysis, encouraging ignorance of design and simplistic analysis that misleads design decisions. Users need to think about and understand services, not place blind trust in the cloud.
    5. "Cloud collaboration is the future!" Sort of - yes, collaborating over the internet is good. Collaborating while being stuck with only one vendor is bad.
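
    To make the "databases are files" point concrete, here is a small sketch using Python's built-in SQLite support; the table and values are invented for illustration:

```python
import os
import sqlite3
import tempfile

# An SQLite database is literally a single file on disk. It can be
# copied, backed up or handed to another tool like any other file,
# while still being queried like a database.
path = os.path.join(tempfile.mkdtemp(), "project.db")

con = sqlite3.connect(path)
con.execute("CREATE TABLE walls (global_id TEXT PRIMARY KEY, name TEXT)")
con.execute("INSERT INTO walls VALUES (?, ?)", ("wall-guid-001", "Wall-001"))
con.commit()

names = [row[0] for row in con.execute("SELECT name FROM walls")]
con.close()

print(os.path.isfile(path))  # the "database" is an ordinary file we control
```

    The two paradigms coexist happily: you query it like a database, and you retain it, move it and back it up like a file.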
  • edited March 2021

    When working with cloud storage, the old Joni Mitchell song always comes back to me..

    I've looked at clouds from both sides now
    From up and down and still somehow
    It's cloud's illusions I recall
    I really don't know clouds at all

    Or the old Jimmy Cliff song

    I can see clearly now the rain is gone.
    I can see all obstacles in my way.
    Gone are the dark clouds that had me blind.
    It's gonna be a bright (bright)
    Bright (bright) sunshiny day.
    It's gonna be a bright (bright)
    Bright (bright) sunshiny day.

  • Of course they want people to be locked into their system.
    They earned 900 million dollars in 2020 Q4, and they have no plan to stop that.
    https://www.northbaybusinessjournal.com/article/article/autodesk-reports-1-2b/

  • One of the first software vendors to preach the "non-use" of files was Onshape, a SaaS CAD platform. I remember some years ago they were blogging about the "bad practice" of using files and offered no way to import/export from their platform. Nowadays they do have import/export features, and I think they no longer talk about this as much.

    Now I see the link Duncan provided does talk about Onshape - a pretty good post, actually.

  • The API and serialisation via files go hand in hand, and IFC can work both ways: it provides an open, standardised data schema. Eventually it shouldn't matter whether you share information according to the IFC schema directly or via one of its many serialisation formats (STEP, XML, JSON, RDF/OWL). Yes, full access to your data is crucial! And I hope that this will be decoupled from any specific tool in the end. Think web: you use HTML, CSS, JS and JSON primarily, but have freedom in your toolset.
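
    To illustrate "one schema, many serialisations", here is a small Python sketch building the same (invented) wall entity as a simplified STEP-style line and as JSON; the real SPF and JSON mappings defined by buildingSMART are more involved than this.

```python
import json

# One logical entity, two serialisations. The schema idea (an IfcWall
# with a GlobalId and a Name) stays the same; only the wire format
# changes. Attribute values are made up, and the STEP line is simplified.
wall = {"type": "IfcWall", "GlobalId": "wall-guid-001", "Name": "Wall-001"}

# A simplified STEP/SPF-style line:
spf = "#1=IFCWALL('{}',$,'{}',$);".format(wall["GlobalId"], wall["Name"])

# The same data serialised as JSON:
as_json = json.dumps(wall)

print(spf)
print(as_json)
```

    Either form round-trips the same information; the choice of serialisation is a transport detail, not a change of schema.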

  • I have the clear impression that I'm not going to get around to writing a synopsis to send to upfrontezine, which would be a shame, as he'd almost certainly print it. Is anyone here willing to write a short version of this discussion that we can send to him?

  • @duncan what are the requirements (e.g. word count, images), and is there an example of previous content we can use as a benchmark?

  • @Moult the website is here: https://www.upfrontezine.com - it's one long page, and there is usually a section called 'Letters to the Editor' where you can see examples.

  • Cheap cloud solution .. SaaS down for 3 weeks + data gone up in smoke.

  • @stephen_l said:
    Cheap cloud solution .. SaaS down for 3 weeks + data gone up in smoke.

    Well... Now everything is in a real black cloud. :D

  • edited March 2021

    I keep my 'datadrop' as small and lean as possible, transferring it to the client as a package using good old email before the model grows out of hand. This way the risk is spread across copies and receivers. Email => PDF / DXF / IFC of finished work. How does 'killing the file' help make this more secure?

  • edited March 2021


    Autodesk foresees a lot of things. Maybe the bottleneck is the pace of it..

  • Would like to offer a different viewpoint here.

    I have very little experience with Catia, which is about as closed a system as you can get and leverages the cloud heavily for file storage and interoperability. But sitting with the Catia guys, going through their model and trying to understand how it works, it makes sense for Catia. There are workbenches, as in FreeCAD, but they have opted to separate them into different apps, sometimes with different file extensions. Some of the lighter apps are completely in the cloud, like their version of Grasshopper/Dynamo called xGen (by the way, it's the best node-based geometry app I've used, as it also automatically builds a tree for non-technical users - but that's a different topic), and their file translator that can export files to Revit, Rhino, etc. is a cloud app too. This means that some of the heavy computation is done in the cloud, and that the cloud-based apps can be updated constantly without massive overhead. Their ecosystem is also designed to be shared modularly, allowing, for example, a structural engineer to tap into some of the files, a mechanical engineer into others, and so on.

    From the perspective of what it could be, I can see why Autodesk might push this. And from the perspective of an architect working professionally with Revit, by necessity of being in a large company, I can also see some benefits in having an asset-manager-like system for projects - something like BIM360 on steroids.

    The problem is that it's Autodesk, which means it will be messy and will likely result in doubling up, with files/asset management in the local company's cloud and again on Autodesk's. Some Revit projects that are live collaborations between different offices are already completely cloud-based, and the trouble there is that you need a BIM manager to help on the project any time you do something out of the ordinary, like trying out a new option sketch. There is zero flexibility for designers. I am, however, not sure how well sharing files with the wider consultant team works for firms that use BIM360 extensively.

    Yet these trends are hard to stop. Remember the beginning of the switch from perpetual to subscription licensing models? SketchUp also has a ton of new cloud-based functionality. Still, I agree that it's hard for this to be the death of the individual file. I remember iPads before Apple released the Files app: it was a black box, but the files did exist - the user just had no access to them. Apple got smart and finally introduced a Files app, realising that individual files and file organisation are certainly not dead.

  • That's a great summary which hopefully resonates with my earlier response too: "Many positives as a general concept, but it's Autodesk we're talking about here..."

  • Since we're bashing Autodesk for a moment ... we use BIM360 on almost all projects now. Certainly anything started in the last year. Autodesk can't even get Revit Links, Linked IFCs, Linked Point Clouds and Linked NWD files to all work in the same way. They each work slightly differently or badly. It's what's technically known as a shit-show.
