I recently discovered the industry significance of Bret Victor, a thought leader in the field of Human-Computer Interaction. He is a computer scientist and interaction designer who was involved in the early development of the iPad. Alan Kay described him as “one of the greatest user interface design minds in the world today.” After watching a few talks and reading a few articles by him, I began to wonder about his impact on the AR landscape today. This is the first part of my reflection on his principles and my personal take on how they have influenced AR interactions today.

Personally, my passion aligns with Bret's: making tools for creators. Having been a creator myself, I found myself more interested in the tools than in producing the content. I watched two of Bret's videos where he talks about how to make tools for artists to create intuitively and expressively. In this post, I mainly want to talk about where AR creation tools are now, and how Bret's experiments may have made an impact on how tools are created today.

https://s3-us-west-2.amazonaws.com/secure.notion-static.com/72a1cef6-7bdb-4552-b2f7-04f20f7cb01d/maxresdefault.jpg

In Stop Drawing Dead Fish, he starts the talk by discussing how the computer is a different medium from what came before. He believes that drawing a silhouette of a fish is good enough for paper, but not for a computer. On a computer screen, we expect things to move, so they should. His point is that we should make art interactive, in a way that is native to the medium.

Coming to the AR landscape today, in 2019, I believe the same is true. First, AR is a 3D environment, so what creators augment in such an environment should at least have volume and depth. Second, the new thing an AR environment affords, unique to AR, is interaction with the perceived physical world. It not only makes the experience more believable but also puts its scale in context.

In the second section of Stop Drawing Dead Fish, he demonstrates how he live-manipulates a fish that has pre-programmed behaviors, and discusses the division of roles: which parts an artist should control and which parts should be taken care of by the computer.

His context was creating an animation as the outcome, where the computer handles mini behaviors that stay consistent (the fish always moves its tail), and the artist controls higher-level interactions (when encountering a shark, the fish is first stunned and then moves away as quickly as it can). In the AR context, there are three categories of tools:

1. Tools to make AR tools

"Tools to make AR tools" still lies in the a more technical development world, developer are using ARKit for iOS and ARCore for Android to do slightly lower level programming.

2. Tools to make AR experiences

For tools to make AR experiences, Reality Composer by Apple is the perfect example. It lets users visualize assets in 3D and preview them in AR with one tap. It features a common design-tool interface, with a property panel on the right whenever an object is selected, so creators can immediately see how tweaking a parameter affects the 3D representation. The role division between artist and computer also applies here: for example, you can set "Bounce" or ten or so other behaviors on an object as "nice to have" effects, while which path to move the object along, and exactly when, is based on the creator's artistic instinct, or in Bret's words, "comes from listening to the emotion in your body".
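On the developer side, a scene authored in Reality Composer can then be loaded with RealityKit. A minimal sketch, assuming an Xcode project that contains a Reality Composer file named Experience.rcproject with a scene called Box (the names Xcode's AR app template generates; adjust for your own project):

```swift
import UIKit
import RealityKit

class ExperienceViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Xcode generates a typed loader for each scene in the
        // .rcproject; behaviors authored in Reality Composer
        // (such as "Bounce") travel along with the loaded anchor.
        if let boxAnchor = try? Experience.loadBox() {
            arView.scene.anchors.append(boxAnchor)
        }
    }
}
```

The split mirrors Bret's: the canned behaviors ride along automatically, while what to place, where, and when it moves stays in the creator's hands.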

3. End users using AR tools

In the third category, end users using AR tools, the applications are mainly industrial, and I haven't had in-person experience with them yet.