My AI-powered NPC Meta Quest 3 mixed reality project

Interesting, I'm trying to decide if it's reading it live or picking up information from another source.

My two random thoughts are a Bigscreen Beyond group screening and VRChat (which has a number of possibilities like chaperone / safety companion / auto-banning people etc.).

It's neat though, she seems to know where you are in the movie after a few seconds of thought/analysis.
She's reading it live, but I have to admit she's not doing video-to-text analysis; I don't think that is available at the moment, or at least not publicly. It'll work for any movie etc. and it adjusts to whatever is being watched.

I'm trying not to reveal how it's done, as I'm having a bit of banter with Patreon members on working out how the app does it. I'll reveal all soon of course; I'll have to when it's released as part of the app anyway, as it requires a file being downloaded. There's the clue!
 
No no, that's completely fair. Apologies.
No need for apologies dude! It's basically using the subtitle file for the movie: the avatar syncs the subtitles with where the movie is up to, and comments on what they think is going on, either at randomly chosen times or when asked to. Been watching movies with it today; it's not bad but far from perfect. Some things they say are spot on, some are off the mark, but it's easy to put them back on the right track.
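For anyone curious about the general idea, here's a rough sketch of the subtitle-sync part as I'd imagine it: parse a standard .srt file and look up whichever line is active at the movie's elapsed time. This is just my guess at one way to do it, not the app's actual code, and the function names are made up.

```python
# Hypothetical sketch: find the subtitle line active at the current
# elapsed movie time, so the avatar can comment on roughly what's
# happening. Assumes a standard SubRip (.srt) file.
import re

def parse_srt(path):
    """Parse an .srt file into a list of (start_sec, end_sec, text) tuples."""
    entries = []
    with open(path, encoding="utf-8") as f:
        blocks = f.read().strip().split("\n\n")
    for block in blocks:
        lines = block.splitlines()
        if len(lines) < 3:
            continue
        # Timing line looks like: 00:01:02,500 --> 00:01:05,000
        m = re.match(
            r"(\d+):(\d+):(\d+),(\d+) --> (\d+):(\d+):(\d+),(\d+)", lines[1]
        )
        if not m:
            continue
        h1, m1, s1, ms1, h2, m2, s2, ms2 = map(int, m.groups())
        start = h1 * 3600 + m1 * 60 + s1 + ms1 / 1000
        end = h2 * 3600 + m2 * 60 + s2 + ms2 / 1000
        entries.append((start, end, " ".join(lines[2:])))
    return entries

def current_line(entries, elapsed_sec):
    """Return the subtitle text active at elapsed_sec, or None if silent."""
    for start, end, text in entries:
        if start <= elapsed_sec <= end:
            return text
    return None
```

The recent lines around the current one could then be dropped into the LLM prompt whenever the avatar is asked to comment.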
 
Subtitles are good for the dialogue but will often lose the context.

I'm not sure how available they are, but what about the audio descriptions provided for visually impaired users? There might be more context in those to guide the avatar.
 
Good idea, and I did think of that, but couldn't find anything of real value that was publicly available tbh. I looked for them for Blade Runner 2049, though not extensively; will look again.

Edit: Just found a site with audio description files actually, but realised they will probably be too slow to be of use, or too much for the Q3 to deal with while doing everything else at the same time. Unless I use the new ChatGPT voice input mode, which is really expensive, audio description files have to be transcribed to text first so they can be fed to ChatGPT to process, and that can't be done in good time unfortunately. If I could get a text transcription of the audio description files then that would be great; will keep looking.
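One thought on the speed problem: the transcription doesn't have to happen live on the headset at all. It could be done once offline (with whatever speech-to-text engine, e.g. a local Whisper model) and cached as a text file alongside the audio. This is purely a suggestion sketch; the function and paths are made up, and the transcriber is injected so any engine could be plugged in.

```python
# Hypothetical sketch: transcribe an audio-description track once and
# cache the text next to it, so the headset only ever reads a text file.
# `transcribe` is any callable taking an audio path and returning text
# (e.g. a wrapper around a local Whisper model) -- an assumption here.
from pathlib import Path

def cached_transcript(audio_path, transcribe):
    """Return the transcript for audio_path, transcribing only on a cache miss."""
    cache = Path(audio_path).with_suffix(".txt")
    if cache.exists():
        return cache.read_text(encoding="utf-8")
    text = transcribe(audio_path)  # slow step, runs once offline
    cache.write_text(text, encoding="utf-8")
    return text
```

After the one-off transcription, feeding the text to ChatGPT would be no different from the subtitle file.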
 
What would happen if you 'said' you had to leave the room for five minutes? What would the NPC do?
Tape over the proximity sensor and put the headset down, so no breathing or movement would be detected. Would the AI get bored, start humming, rifle through your collection of vintage pron magazines? Just curious.
 
Lol! Interesting question! They are coded to eventually initiate conversation if nothing is being said tbh. Maybe I should code them to lie down for a good bit of shuteye if a long pause is detected.
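The idle behaviour described above could be as simple as a couple of silence thresholds; a minimal sketch, with entirely made-up timings and action names:

```python
# Hypothetical sketch of NPC idle behaviour: after a short silence the
# NPC speaks up; after a much longer one it switches to a sleep (or
# meditation) animation. All thresholds are assumptions.
import time

SPEAK_AFTER_SEC = 45    # initiate conversation after this much silence
SLEEP_AFTER_SEC = 300   # switch to an idle/sleep animation after this

def idle_action(last_speech_time, now=None):
    """Return the action the NPC should take given when speech last occurred."""
    now = time.monotonic() if now is None else now
    silence = now - last_speech_time
    if silence >= SLEEP_AFTER_SEC:
        return "sleep"
    if silence >= SPEAK_AFTER_SEC:
        return "initiate_conversation"
    return "wait"
```

Polling this once a second from the main loop would cover both the "speaks up eventually" behaviour and the five-minutes-away scenario.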

Throw all your belongings on the lawn, boil your rabbit and scream IT'S OVER!!!
Eurgh, I've had a GF for whom things like that weren't beyond the realms of possibility; makes me shudder a bit tbh thinking about it. All I can say is I wasn't sober whenever I was with her.
 
Instead of shut-eye, maybe "meditation" would work better; it leads on to guided meditation conversation or help.
 