Disable ScreenCapture2D Every Frame
This week I was able to stop the ScreenCapture2D from rendering every frame to the newly spawned object. However, I still couldn't stop the objects that had already been spawned from updating.
Gabe | Spring 2018 | Tue. 3:00 pm - 6:00 pm | Thesis Journal
Screen Capture 2D
If the two performers sit next to each other in the real world, they should also see each other sitting next to each other in the VR world (at least at the beginning). How would I unlock the HMD presence in VR so that it isn't dependent on the user's head?
How do you add the second, third, and subsequent plugins to your working project?
I could add the first plugin to the project by creating a C++ project, but for the second and third I couldn't, and I ended up recreating the whole project file from scratch and re-adding all the plugins at the same time.
UE Blueprint
Instantiate an object (a wall/rectangle with colliders/physics) from a touch gesture and place the object around the performer, with a material that is captured from the performer's phone screen.
Over the past week, I've been trying to figure out how to do multiplayer in VR with UE (Unreal Engine). There are two ways I can do this.
I decided to go with Plan A before using the template because I wanted to learn the software, UE. I've been watching and following the official UE tutorials online [ Link ]. Although they didn't cover the VR part, I think they did a very good job explaining the basics of how it works, even if I only understood maybe 20% of it. Still, I learned a lot from this tutorial.
After following the UE tutorials Link (Blueprint Multiplayer), I had this up and running. I managed to spawn two players from two PCs, one as the host and the other as the client. They both used the same file.
Following more of the UE tutorials, I think something went wrong along the way, because I'm stuck on the "Loading Page"... Either I'm not able to spawn inside the scene anymore, or I am spawned in the scene but the loading screen won't disappear.
I've HIT a ROAD BLOCK....
So, let's move to Plan B --->
It took me the whole day, from 9 am to 9 pm, to make this template work. There were so many things I tried and figured out along the way, like:
Yayyyyyyy!!!!! In this video, you can see that they are looking in the mirror in VR and standing next to each other, but in the real world they are actually behind each other. You can see it more clearly when they're pretending to fight. This happened because I did not do the room setup properly and pointed the Vive controllers at monitors in different locations.
Grab Objects
Macarena Dance
Network Connection MUST be itpsandbox
Unsuccessful Multiplayer Connection Test
The left PC is the client. The right PC is the host. This unsuccessful connection happened when I was signed in to a Steam account and not on the same itpsandbox LAN connection.
This week I've been busy trying to send live touch gesture data into UE (Unreal Engine): trying to recognize mobile gesture input (tap & swipe left, right, up, down) while using a native social media app.
This is extremely difficult to do. I'm still a noob when it comes to coding, JavaScript, and knowledge of mobile operating systems and devices (iOS & Android). Not to mention the security issues.
After talking to my friends nonstop for the whole week about this problem, bits and pieces of a solution slowly started to fit together. For my performance piece, instead of the two users using their phones through a native app, they will use their phones through a web browser like Chrome. I believe I would need to implement these elements together: touch gesture recognition in the browser, a web server to relay the data, and a way to get that data into Unreal Engine.
GitHub - Code test touch gestures with server
Touch gestures breakdown:
With Wippy's and Yuli's help, I was able to get five touch gestures recognized from mobile through a web server.
Touch gestures test + hammer.js + socket.io
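For reference, here is a minimal sketch of what the browser side of this demo could look like, assuming a page served from the same server with hammer.js and the socket.io client loaded: it recognizes the five gestures (tap plus the four swipe directions) and forwards each one to the server. The 'gesture' event name and the payload shape are placeholders of mine, not necessarily what the GitHub code uses.

```javascript
// Browser side: recognize tap + 4 swipe directions with hammer.js
// and forward each gesture to the server over socket.io.
// Assumes hammer.js and the socket.io client are loaded via <script> tags.
const socket = io(); // connect back to the server that served this page

const hammer = new Hammer(document.body);
// Enable swipes in all four directions (Hammer only recognizes horizontal swipes by default).
hammer.get('swipe').set({ direction: Hammer.DIRECTION_ALL });

// The 'gesture' event name and payload shape are placeholders for this sketch.
function send(name) {
  socket.emit('gesture', { name: name, time: Date.now() });
}

hammer.on('tap',        () => send('tap'));
hammer.on('swipeleft',  () => send('swipe_left'));
hammer.on('swiperight', () => send('swipe_right'));
hammer.on('swipeup',    () => send('swipe_up'));
hammer.on('swipedown',  () => send('swipe_down'));
```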
Finding the IP address in Windows (took me longer than I expected...)
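Since hunting for the machine's IP address in Windows was a pain, one convenience (assuming the gesture server runs on Node, which a hammer.js + socket.io setup usually does) is to have the server print its LAN IPv4 addresses at startup so the phones know which URL to open. This is just a helper sketch, not part of the original demo, and port 3000 is a placeholder.

```javascript
// Print the machine's LAN IPv4 addresses when the server starts,
// so phones on the itpsandbox network know which address to open.
const os = require('os');

function lanAddresses() {
  const results = [];
  const interfaces = os.networkInterfaces();
  for (const name of Object.keys(interfaces)) {
    for (const net of interfaces[name]) {
      // Skip loopback/internal addresses; keep IPv4 only.
      if (net.family === 'IPv4' && !net.internal) {
        results.push(net.address);
      }
    }
  }
  return results;
}

// Port 3000 is a placeholder; use whatever port the gesture server listens on.
console.log('Open one of these on the phone:', lanAddresses().map(a => 'http://' + a + ':3000'));
```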
Building on top of Demo 01, I was able to send one gesture at a time. I'm getting familiar with Unreal Engine's Blueprint so it can recognize all of them.
Touch gesture number 2 = pan right
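A rough sketch of the relay in the middle, assuming a Node server built with Express and socket.io: it rebroadcasts each incoming gesture tagged with a numeric ID, which is the kind of simple value that can then be picked up on the Unreal Engine side. Only "gesture number 2 = pan right" comes from this journal; the rest of the numbering, the event names, and how UE actually subscribes are assumptions for illustration.

```javascript
// Relay server: serves the touch page and rebroadcasts gestures with a numeric ID.
// The gesture numbering is illustrative; only "2 = pan right" comes from the journal.
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

app.use(express.static('public')); // the page that loads hammer.js lives here

// Hypothetical mapping from gesture names to the numbers sent on toward Unreal.
const GESTURE_IDS = {
  tap: 1,
  swipe_right: 2, // "touch gesture number 2 = pan right"
  swipe_left: 3,
  swipe_up: 4,
  swipe_down: 5,
};

io.on('connection', (socket) => {
  socket.on('gesture', (msg) => {
    const id = GESTURE_IDS[msg.name] || 0;
    console.log('gesture:', msg.name, '->', id);
    // Rebroadcast so whatever client the Unreal side uses can react to the number.
    io.emit('gesture_id', id);
  });
});

server.listen(3000, () => console.log('Gesture relay listening on port 3000'));
```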
I have to find a way to make Chrome extensions or Firefox add-ons work on mobile. From my research, Chrome extensions are NOT available on mobile....
Descriptions
Technology is changing our lives and impacting human behavior in ways we have never seen before. Our relationships with one another have changed: they are no longer bounded by one contiguous space but by the reach of our technology networks. We experience this interconnectedness through numerous applications on our smartphones. We constantly stare, tap, and swipe without realizing the consequences and implications of these actions. However, I believe the more we use our phones to stay connected, the less connected we are in the real world.
My project focuses on these mundane and often normative behaviors (tap, swipe, scroll) and attempts to translate these physical interactions into virtual parameters via virtual reality technology: a parody of the repeated physical interactions we perform every day without thinking.
I wish to translate this cause and effect into VR, where one tap means more than just the simple visual and sound feedback you see on screen. You don't see it, but through our actions and inputs on our smart devices, these electronic networks establish a new type of relationship with the widely scattered people and places we care about. Networks connect and propagate the effects of our actions. Today we can do unto others at a great distance, and they can do unto us. I'm attempting to visualize these hidden connections we all have: smartphones make us closer to the people who are far away from us, but farther from the people we're close to.
The project is a multi-user performance and game-like storytelling experience for two users in which they are the creators of the virtual world. The experience requires translating each user's physical input/interaction into virtual parameters via VR controllers (I want to use smartphones instead of VR controllers). At the beginning of the experience, they will physically sit next to each other, and once they put on the VR headsets, they will see a white world (a blank canvas) with their phones' screens. The more they use (swipe, tap, scroll) their smartphones in VR, the more content (wireframes) will be generated, until eventually they can no longer see each other in VR, because they will feel a gradual shift away from each other during the game.
For now, I will focus on two people as my user subjects. They are friends and use their phones often. I will use the Moment app to collect and log data for one week. The Moment app tracks which apps my users use the most and how much time they spend connected. The VR environment's content will depend on which app my users used the most. I'm thinking of using existing UX/UI wireframes that are procedurally generated by the users' input (tap/swipe/scroll). After I've collected which apps they used the most, the test subjects will play my VR game too.
Personal Statement
As a trained architect, I have always had a keen awareness of our surroundings and have always designed for the built world. Data and information gathered from physical systems and sensors help architects design smart homes and, eventually, smart cities. I feel the architect's role in the new digital age is not only to shape, arrange, and connect spaces in the physical world to satisfy human needs, but also to speculate on and be aware of the connected spaces of the virtual world, because networks have become part of our daily life.
In a rhizomatic way, humans have a symbiotic relationship with technology, and just as every human is created uniquely, our relationship with our phones is also uniquely different. We are co-creating and uniquely tailoring our personal space in the digital world, customized to our needs and interests.
"If we understand what is happenings, and if we can conceive and explore alternative futures, we can find opportunities to intervene, sometimes to resist, to organize, to legate, to plan, and to design”, William J Mitchell
The experience will start off with a white world.
___ To be continued ___
I was inspired by the concept of a never-ending runner game like "Minion Rush". I love this idea of the never-ending game because it's a metaphor for how we go through life day-in/day-out in a loop, like an endless cycle of life. Wake up - Shower - Eat - Commute - Work - Eat - Commute - Eat - Shower - Sleep - REPEAT.
The content that gets constructed will be based on the users' touch gesture interactions with their mobile phones (the VR controllers).
___ To be continued ___
I will continue to update this calendar. I might change a few things here and there.
I came across this blog while researching how to pick a color palette for my experience. I found it quite useful.
I want my experience to start off with a white world, like a blank canvas, and keep a monochrome or limited tonal palette. I'd like to keep it minimal and highlight main events for dramatic effect.
Possible Highlight Colors
The tools I used to pick the color scheme are:
Only God Forgives
Sin City