
Google Can Now Live-View Your Screen. Is This Safe?

Internet users are reporting that Google has started rolling out camera and screen-sharing features to Gemini Live, a move confirmed by Google spokesperson Alex Joseph in a statement to The Verge. The software processes real-time input from your phone’s display or camera lens and answers queries on the spot.

Gemini Live can analyse an app open on your device and respond instantly. A Reddit user demonstrated the feature on a Xiaomi phone, and others online have posted similar recordings. The system reads text and images as they appear on screen and explains them straight away.

The second feature uses live video from your camera. In one clip published this month, a user pointed the lens at newly glazed pottery and asked Gemini for paint ideas; the assistant offered colour suggestions in real time, without lag.

These camera capabilities grew out of work Google previewed in 2024 under the name Project Astra. Gemini Live now brings both features to Gemini Advanced subscribers, opening up a different style of phone-based assistance. A report from 9to5Google states that the rollout is underway, though it may be gradual.

Menaka Shroff, Senior Director of Global Android Marketing, said earlier this month: “We’re also showing new live video and screen-sharing capabilities in Gemini Live, which will start rolling out to Gemini Advanced subscribers as part of the Google One AI Premium plan on Android devices later this month.”


When Did This Start?


Signs of the upgrade first showed up on Reddit, where a user spotted a new button marked “Share screen with Live.” The option appeared above the usual query bar and gives the assistant full visibility of whatever is on the phone display. The same person also noticed a camera toggle in the Gemini Live menu, which enables quick video input.

Another user posted footage of the system reacting in real time as they navigated different apps, with the tool reportedly handling text detection and object recognition smoothly. Google One subscribers who pay for Gemini Advanced are set to receive the same features.

Google first announced these plans in early March, promising a March release for Gemini Advanced subscribers. Reports from phone users show the features are now active, though no exact timeline has been published and each device may receive them at a different pace.


Who Gets It And What Is Different?


Gemini Live sits within Google’s assistant framework on Android. What stands out is the continuous view of your screen or camera feed, which makes day-to-day tasks more direct. This departs from older assistants that required snapshots or single images.

One demonstration video showed a user testing paint colours for a newly glazed bowl, with Gemini suggesting shades. Another showed the assistant reading text while the device owner scrolled through messages. The processing took place in real time, without pauses.


What About Privacy?


Gemini Live handles recordings and transcripts under the Gemini Apps Privacy Notice. If Gemini Apps Activity is switched on, these details are stored in your activity history, and you can review or delete them at any time.

When Gemini Apps Activity is off, chats are still kept in your Google Account for up to 72 hours to support service maintenance and feedback. When the setting is on, text transcripts may be reviewed to improve Google AI.

Live audio files are not currently part of that review, and Google says it will announce any change to that approach.

There is also a reminder to respect privacy rules: ask permission before you record others or bring them into a Live session, so that everyone involved knows what is being shared.
