Goal:
- Create an API that, when provided a JPGVR image, returns the encoded XMP metadata.
- This metadata includes audio, the left and right lens images, and dimension data
- Create an aFrame XR web app that uses this JPGVR API to display images in the Oculus Quest web browser and play sound: adapt this from Unity to the web, and make it better
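To make the goal concrete, here is a sketch of the JSON shape the JPGVR API could return. All field names are my own assumptions, loosely modeled on the Cardboard Camera XMP namespaces (GImage, GAudio, GPano); adjust to whatever the extractor actually exposes.

```typescript
// Hypothetical response shape for the JPGVR API -- an assumption, not a spec.
interface VrPhotoMetadata {
  /** Primary (left-eye) image, e.g. a URL or base64 string. */
  leftImage: string;
  /** Right-eye image recovered from the GImage:Data XMP blob. */
  rightImage: string;
  /** MP4 audio recovered from the GAudio:Data XMP blob, if present. */
  audio?: string;
  /** Pixel dimensions, e.g. from the GPano cropped-area properties. */
  width: number;
  height: number;
}

// Example value, for illustration only.
const example: VrPhotoMetadata = {
  leftImage: "/photos/1/left.jpg",
  rightImage: "/photos/1/right.jpg",
  audio: "/photos/1/audio.mp4",
  width: 4096,
  height: 2048,
};
```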
My current collection of .vr.jpg photos: Album
Tech:
- Rust with rexiv2 to convert the files, wrapped with Node.js to deliver them through a REST API
- aFrame
- TypeScript + WebAssembly
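The Node.js delivery layer above only needs a tiny REST surface. A sketch of the routing as a pure function (the paths and the `Route` shape are my own assumptions, not a final design):

```typescript
// Minimal routing sketch for the Node.js layer in front of the Rust/rexiv2
// converter. Paths are assumptions for illustration.
type Route =
  | { kind: "metadata"; photoId: string }                       // GET /photos/:id/metadata
  | { kind: "asset"; photoId: string; part: "left" | "right" | "audio" }
  | { kind: "notFound" };

function route(pathname: string): Route {
  const meta = pathname.match(/^\/photos\/([^/]+)\/metadata$/);
  if (meta) return { kind: "metadata", photoId: meta[1] };
  const asset = pathname.match(/^\/photos\/([^/]+)\/(left|right|audio)$/);
  if (asset) {
    return { kind: "asset", photoId: asset[1], part: asset[2] as "left" | "right" | "audio" };
  }
  return { kind: "notFound" };
}

console.log(route("/photos/42/metadata")); // { kind: "metadata", photoId: "42" }
```

Keeping the routing pure makes it easy to unit-test before wiring it into an actual `node:http` or Express server.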
Steps:
- [ ] Finish the "learning rust" guide
- [ ] Save an APK / IPA of Google Cardboard Camera just in case it randomly gets killed: https://killedbygoogle.com/
- [ ] Use rexiv2 to convert a .vr.jpg file into parts
- [ ] Design an API around which parts of the XMP metadata are valuable
- [ ] Web app that runs the conversion in WebAssembly
- [ ] Dev usability goal: TypeScript bindings around the WebAssembly module?
- [ ] Design an aFrame app that runs in the Quest browser and displays a left / right image pair hard-coded as a static asset
- [ ] Add audio
- [ ] Basic Oculus Touch controls support
- [ ] Add a pinned image wheel that rotates around you with a hand pinch
... figure out the easiest file storage implementation to integrate
- maybe GitHub sign-in or something?
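The "convert a .vr.jpg into parts" and WebAssembly steps above both come down to walking the JPEG's APP1 segments, which is where standard and extended XMP live (Cardboard Camera stashes the right-eye image and audio in extended XMP). A simplified browser-side sketch of that walk, assuming every pre-SOS segment carries a length field (true for typical camera output):

```typescript
// Walk a JPEG's segments and collect APP1 payloads, where XMP lives.
// Simplified sketch: assumes every segment before SOS has a length field.
const XMP_HEADER = "http://ns.adobe.com/xap/1.0/\0";
const XMP_EXT_HEADER = "http://ns.adobe.com/xmp/extension/\0";

function findApp1Segments(jpeg: Uint8Array): Uint8Array[] {
  const segments: Uint8Array[] = [];
  if (jpeg[0] !== 0xff || jpeg[1] !== 0xd8) return segments; // no SOI: not a JPEG
  let i = 2;
  while (i + 4 <= jpeg.length && jpeg[i] === 0xff) {
    const marker = jpeg[i + 1];
    if (marker === 0xda) break; // SOS: entropy-coded data follows
    const length = (jpeg[i + 2] << 8) | jpeg[i + 3]; // includes the 2 length bytes
    if (marker === 0xe1) {
      segments.push(jpeg.slice(i + 4, i + 2 + length)); // APP1 payload
    }
    i += 2 + length;
  }
  return segments;
}

function isXmpSegment(payload: Uint8Array): boolean {
  const head = new TextDecoder().decode(payload.slice(0, XMP_EXT_HEADER.length));
  return head.startsWith(XMP_HEADER) || head.startsWith(XMP_EXT_HEADER);
}
```

On the Rust side, rexiv2 should surface the same data through its tag API instead of a manual walk; this sketch is mainly useful for the in-browser WASM path.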
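For the aFrame left / right display step, one common trick is three.js camera layers: as far as I know, three.js's stereo rendering enables layer 1 on the left-eye camera and layer 2 on the right-eye camera, so a mesh restricted to one of those layers renders for that eye only (worth verifying against your three.js version). A small helper plus the shape of the component wiring:

```typescript
// Map an eye to the three.js camera layer that (I believe) the stereo
// renderer enables for it: layer 1 = left eye, layer 2 = right eye.
type Eye = "left" | "right";

function layerForEye(eye: Eye): number {
  return eye === "left" ? 1 : 2;
}

// In an A-Frame component you would apply this to each eye's mesh,
// roughly like the following (untested wiring, shown for shape only):
//
//   AFRAME.registerComponent("stereo-eye", {
//     schema: { eye: { type: "string", default: "left" } },
//     init() {
//       const mesh = this.el.getObject3D("mesh");
//       mesh.layers.set(layerForEye(this.data.eye as Eye));
//     },
//   });
```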
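The pinned image wheel step is mostly placement math: put n photo panels on a circle of radius r around the viewer, all facing inward, and let the pinch gesture drive a phase offset. A sketch under those assumptions (A-Frame's convention of -Z as "straight ahead" and Y-up, yaw in degrees):

```typescript
// Place n panels on a circle of radius r around the viewer; `phase` (radians)
// is the rotation the pinch gesture would accumulate. Index 0 at phase 0
// sits straight ahead (-Z in A-Frame); yaw turns each panel to face you.
interface Placement { x: number; y: number; z: number; yawDeg: number }

function wheelPlacements(n: number, r: number, phase = 0, y = 1.6): Placement[] {
  const out: Placement[] = [];
  for (let i = 0; i < n; i++) {
    const a = phase + (2 * Math.PI * i) / n;
    const x = r * Math.sin(a);
    const z = -r * Math.cos(a);
    const yawDeg = (-a * 180) / Math.PI; // front (+Z) of each panel faces the origin
    out.push({ x, y, z, yawDeg });
  }
  return out;
}
```

Each `Placement` maps directly onto an entity's `position` and `rotation` attributes; the pinch handler only needs to re-run this with an updated `phase`.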