SoCal AEC Hackathon 2.3

Team name: Owen Derby and Brok Howard from Team VRML (Virtual Review Management Loop)

[The original VRML can be found here – much respect to those champions who paved the way before us]

What Big AEC problem are you trying to solve?

The design process involves constant change, lots of interaction, and a fast pace. Communication with clients can be a challenge when trying to include them in the process. In recent months Virtual Reality has started to get a stronger foothold in the AEC industry. We are not building games, but real human experiences in the AEC world. We do not experience this through flat 2D drawings or high-resolution renderings. With advancements in heads-up display gear, the hardware is starting to catch up to the dream of a virtual experience prior to construction.

Using an Oculus Rift DK2, Revit, and open source software, we have linked the hardware, the design, and the web. Using three.js, WebGL, and the OBJ file format, anyone can view and move around the design from a URL.

The client can simply open a browser link from an email, connect a future commercially available heads-up display (HUD) like the Oculus Rift (launching next year), and see the design. “Can you change ‘this here’?” asks the client. The designer updates the design, saves and uploads the file, the browser refreshes, and the client sees the change. Designer and client continue this process using the same link. The site can be accessed by anyone with the URL and experienced by any number of people at the same time.
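The “save, sync, refresh” loop could be sketched in a few lines of browser JavaScript. This is a minimal illustration under stated assumptions, not our actual code: the model path, polling interval, and the `loadModel` callback (which would wrap something like a three.js OBJ loader) are all hypothetical.

```javascript
// Hypothetical path to the OBJ file in a Dropbox-synced web folder.
const MODEL_URL = '/models/design.obj';

// Pure helper: a new Last-Modified value means the designer pushed an update.
function modelChanged(lastSeen, latest) {
  return latest !== null && latest !== lastSeen;
}

// Poll the server every few seconds; when the file changes, re-fetch the OBJ.
// loadModel is whatever re-parses the OBJ into the scene (e.g. three.js).
async function watchModel(loadModel, fetchImpl = fetch, intervalMs = 5000) {
  let lastSeen = null;
  setInterval(async () => {
    const res = await fetchImpl(MODEL_URL, { method: 'HEAD' });
    const latest = res.headers.get('last-modified');
    if (modelChanged(lastSeen, latest)) {
      lastSeen = latest;
      loadModel(MODEL_URL); // swap the updated model into the scene
    }
  }, intervalMs);
}
```

In the real setup the designer just saves the OBJ into the synced folder; everyone viewing the link picks up the change on the next poll.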

Can your solution be implemented on Monday?

Yes, this is mostly using existing tools and open source software.

How much of the code was built this weekend?

Because we’re mostly reusing lots of open source code, very little was built this weekend. However, everything that was built by us was built this weekend.

Technical difficulty?

Medium (easy for the designer, a medium challenge for the developer). WebVR compatibility is very new, which limited us to the nightly build of Chromium; working with a developer kit of hardware that is not yet sold (the Rift) and a file sync system (in this case Dropbox) created some challenges, but the overall process is simple and understandable.

Bonus: What open standards do you support or use?

We used the open OBJ 3D file format, three.js for rendering in the browser, and the WebVR open standard.

Bonus: Is your project open source? Yes, here it is on

Here we are at AEC Hackathon 2.3 in the live demo on YouTube.

AWE Day 3

The AR heads-up display UI designers had to have been inspired by Jayse Hansen’s presentation showing how he designed the HUDs for Iron Man and Big Hero 6.

Helen Papagiannis plugged her book; not much else was said.

Ralph Osterhout on the ODG R-7 and R-8 – 15 degrees of focus – suggested combining AR with VR. Why do we let technology control us; why not force technology to work for us? Good question, Ralph. Now can I get your technology for cheap?

Jon Cabiria said only 1 in 3,000 startups survives after 5 years. Autopsyio –

Johnny Lee – project lead of Project Tango – covered Motion Tracking, Area Learning, and Depth Perception. I might just have to get a Project Tango device to explore at the next AEC Hackathon.

AR vs VR Debate was awesome! 

Check out the Inspire Session on YouTube

AWE Day 2

Today we started in the big room with the main stage keynotes. In general I was hoping for some inspiration and awe; what was presented reminded me more of a product LAUNCH presentation – one elevator-pitch presentation after another. I did learn that HP has their own AR platform called Aurasma, using Aurasma Studio. RJ Holmberg shared a bit.

Kayvan Mirza from Optinvent talked about the molded-plastic ORA-1 eyewear glasses they are now shipping for enterprise use, touched on social acceptance being the big barrier, and said they are addressing that soon with the ORA-X. It looks more like headphones with a drop-down eye lens on one side.

Catchoom shared what they have been focused on: image recognition, and making it available offline with their new CraftAR SDK.

I really liked the ideas behind Scope AR – David N….. – and Remote AR, where you send video from an expert to a technician as a virtual support effort, like having someone looking over your shoulder. They have incorporated their technology into enterprise eyewear.

Pete Jameson’s ODG R-7 smart glasses were very impressive; in short, they are everything your smartphone is, but in the form of glasses. I tried them out on the exhibit floor; there must be some intense processing going on, because they got hot to the touch along the top.

Teri Aaltonen from Augumenta shared their virtual keyboard, which uses nothing but hand gestures. They published an SDK last year and have a new SDK with several updates.

Zappar is one of many in the business of AR content creation tools, but they seemed much cooler – maybe it was the accent and the fun game examples.

Pete from Augmate shared a big challenge in deployment with what they call Glass at Work and offered a solution called WE (Manage) – Wearable Environment. They intend to solve the implementation challenge with an integrated system. If I were looking for a solution, I would check them out.

Third Space Agency (from the founder of Second Life) is working on immersive locations for artists.

I really enjoyed learning about the Björk film installation at MoMA – check out “Black Lake” on YouTube. Brian Pene from Autodesk shared how some of their tools were used in the project, including Autodesk Memento Beta.

Xrez Capture 

On object-based audio (Barco and Two Big Ears), Chris Pike from BBC R&D shared how the audio integrated with the full AR video experience. Each of the 30 string recordings was placed in 3D space so you could explore the arrangement as you walked through the installation.

I went to a Smart Glasses introduction session where Astray and Shatter Isle shared some insight.

The future of play is physical combined with the digital – that was the big point from Haptics, who are linking AR technology with Nerf games: laser tag with people and virtual zombies.


John Shulters from Treehouse Designer Ascension uses photogrammetry to capture reality, helping designers make professional tree house designs accurate.

Index AR Solutions shared how they are providing AR solutions for complex Newport News Shipbuilding projects.

Then I hit the Exhibit Floor.

Check out these sessions from the Demo Day on YouTube.

AWE 2015 – Day 1

Today was light on learning because it was volunteer day, and that was fine. I got to scan people’s badges. They used RFID badges with NFC. Not sure what that means? You’re not alone; I had to look it up myself. Basically, each attendee’s little plastic ID badge, tethered around their neck, has a small RFID tag inside, so when I touched the back of the NFC device in my hand to the badge, a little bit of data transferred from the tag to the handheld device. This is not new technology, but it got me thinking: why are we doing this? I guess those running the show can review live data on who and how many attendees are in the room. It also automatically sent an email to the attendee to fill out a survey. (Although I heard the link in the email did not work to launch the speaker survey.) It was a bit awkward touching my handheld to people’s badges, even more so when they went hands-free and I had to either grab their badge or just touch the handheld to their mid torso – “tag me!” they would say with a grin. Maybe at a future AR conference they will simply use RFID gates, so attendees get scanned as they walk through the door (both when they arrive and when they leave).

I also observed that a few classes could have been in a room twice the size. But I guess that is what you get when attendees don’t sign up for classes and can easily roam from one session to another. Three really popular classes with large crowds at the door were on eyewear technology, 3D Mapping the World, and Internet of Things and AR – all in the same tiny room. I will wait to review those when the videos are published.

I did get a session in the morning: How to Choose Enterprise Use Cases for AR. It was interesting to hear the overarching issues that face real enterprise implementation. I heard a recurring theme: have a plan and know the end goal. This concept is something I can relate to. Juergen Lumera from Bosch was my favorite speaker of the bunch, mainly because he was entertaining and quotable. He said, “don’t listen to scientists and engineers (like himself) to find use cases.” They are not the people who will use this technology; instead, you need to “get the perspective of technicians and end users.” He also stressed having a test-case end solution in mind. Know where you are going. What are you trying to solve? Where are you trying to save effort or money? Know your ROI goal, “otherwise you will never finish and your boss will think you are just playing with cool stuff.”

I liked the example from John Simmins of EPRI on Smart Safety Glasses solutions. Field workers already need eye protection, so adding features that support their job while serving another purpose makes for an easier case.

APX shared what they have learned about integration; they are facing the challenge of working with existing enterprise IT systems. They suggested designing a clean workflow so that, as systems change, the solution stays scalable.

I learned of a new group trying to push AR forward called AREA, the Augmented Reality for Enterprise Alliance.

And I learned a new acronym from Jonathan Zufi from SAP when he was talking about OData: CRUD – create, read, update, delete.

I am looking forward to the keynote on day 2, the panel discussion hosted by Damon Hernandez, and the exhibit floor. I also remembered my boost charger so I can burn up my Twitter feed today. #AWE2015

Check out what sessions you might have missed on YouTube

Carl Callewaert from Unity – Unity 3D for AR

Roy Ashok from Qualcomm – Vuforia Apps for Toys

AWE 2015 – Why I am Going

The Augmented World Expo 2015 starts today. I learned about it through @VDCwhiz Cesar Escalante at work. He and I are volunteering at the conference in order to get free conference passes. The conference is at the Santa Clara Convention Center. I am taking Caltrain to the conference, my first trip using the “real” train south. It’s only a 2-hour trip door to door, but I truly feel like I am leaving the city for a conference while still getting to sleep in my own bed each night. I hope to get some work done during the commute.

I plan to get my badge, take a few classes this morning and volunteer this afternoon. I then get full access the next two days.

Why am I going? This is not your typical AECO conference. In fact, I don’t expect to see many people from “our” world there. There is one session tomorrow with my friend @MetaverseOne Damon Hernandez, the hacker behind @AECHackathon; his panel will be talking about AR in real estate and construction. I am going because this technology will enter the AEC market soon. We as an industry are typically slow to evolve, embrace change, or implement new technology. Microsoft HoloLens has been getting attention lately, and since the SketchUp team is trying to integrate with it, architects at the AIA are now gaining interest. Construction has started to show interest, from what I observed at FutureTech and the BIMForum; players like Daqri have been showing off their AR-integrated hard hat, which I plan to check out on the exhibit floor.

Many design firms want to put their creations in gaming engines, but they don’t know what that means. They download a free version of Unity and hope to create an Xbox experience for a tech-savvy client. Google ended their developer conference keynote last week with VR, showing the new 360 GoPro rig, with 360 Hero coming soon. Why end a huge keynote at a critical conference with VR if it’s not of interest to an important innovator like Google? I know I am paying more attention. I am going to see what more I don’t know and could learn about Augmented Reality. Could AR be the next disruptive technology? I plan to find out this week.

Follow the conversations via Twitter #AWE2015

Discover -> Share -> Document -> Repeat

I am a process guy. I love helping to improve workflow. I am also an ideas person. I might not know all the answers, but sometimes I get the concept or initial idea right. That is one of the reasons why I love hackathons, like the ones coming up in SoCal, in London in July, and in Seattle in three months. You get to take ideas and test them really quickly. Recently, working with software developers, I have also learned that feedback is more valuable than you would think. They try something, find out if it works (or if you like it), and then implement it. The feedback loop costs time and money, so the quicker the response, the quicker to market.

You can also see this concept at work at a higher level. I think of how I use Twitter. I follow the people I would like to have a drink with (coffee or otherwise). If they share something I want others to hear, I re-share it. Sometimes I have something to share and hope that others will share it. I have found that Twitter is my filter for discovery. I have a sea of people looking and finding, in a way, for me. I don’t have to follow and read every blog post or news article; if I tried, that is all I would be doing. Instead, I follow those who follow. I also follow some of those who write. In many ways I am (when on Twitter) in a room of like-minded people – an incubator of idea sharing. People who share are my kinda people.

When you combine Twitter – call it discovery – with blogs (like my little site), you start to expand beyond Twitter’s 140-character limit to a larger perspective. That is also why I love sharing videos; I am still not sure if I will leave YouTube and fully go to Vimeo…or something else. [YouTube’s/Google’s latest efforts to deliver VR might swing me back.] When words are not enough, video is great! Blogs and videos are great for sharing and documenting all at once. So, when you put it all together: discover via a tweet or post – share with others via retweet or post – document via post or video – then re-share it. The cycle never ends.

And sometimes it’s just cool stuff online.