
Google’s Call Notes feature could be getting even better

A woman holding a purple Google Pixel 9a.

The Google Pixel 9 series introduced the Call Notes feature, which finally allowed users to record phone calls on their devices. Call Notes uses AI to summarize the main points of a call, generate a transcript, and provide an audio recording. Now the folks over at Android Authority have discovered another feature that could be coming soon.

Specifically, two lines of code suggest you could soon be able to copy and share full call transcripts. This would be particularly helpful for someone conducting interviews for research, or for hiring managers who conduct phone interviews. Users could quickly search for keywords and extract the most important information from a call.


They found the following code, tied to the name "Fermat," the internal codename for the Call Notes feature:

<string name="fermat_transcript_copy_button_description">Copy call transcript</string>
<string name="fermat_transcript_share_button_description">Share call transcript</string>

Even if you aren't used to reading code strings, the intent here is clear. Call transcripts are already available through the Pixel's Call Screening feature (another immensely useful tool, especially if you're frequently the target of robocalls), but this new function applies specifically to the transcripts created via Call Notes.
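For illustration only, here's a minimal Kotlin sketch of how an Android app could typically wire up "copy" and "share" buttons for a transcript string, using the standard ClipboardManager and the system share sheet. The function names and labels are hypothetical and are not taken from Google's code.

import android.content.ClipData
import android.content.ClipboardManager
import android.content.Context
import android.content.Intent

// Hypothetical helpers, not Google's implementation. They show the standard
// Android plumbing behind a "Copy call transcript" or "Share call transcript" button.
fun copyTranscript(context: Context, transcript: String) {
    // Place the transcript on the system clipboard as plain text.
    val clipboard = context.getSystemService(Context.CLIPBOARD_SERVICE) as ClipboardManager
    clipboard.setPrimaryClip(ClipData.newPlainText("Call transcript", transcript))
}

fun shareTranscript(context: Context, transcript: String) {
    // Hand the transcript to the system share sheet as plain text.
    val sendIntent = Intent(Intent.ACTION_SEND).apply {
        type = "text/plain"
        putExtra(Intent.EXTRA_TEXT, transcript)
    }
    context.startActivity(Intent.createChooser(sendIntent, "Share call transcript"))
}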

The Google Pixel lineup has always been a powerful option, but Google has released a slew of features in the past several months that make it even better. For example, the company just launched a new camera feature that's ideal for streamers. Add the utility of Google's Gemini AI and the phones keep getting better, with several Pixel-exclusive features on top. If you're a student, or just someone who likes to learn new things, Gemini's recent Deep Research feature is one you don't want to miss. It can even turn complex topics into easy-to-consume podcasts.

And if recent rumors are to be believed, the upcoming Google Pixel 10 series will offer even smarter AI assistance.

Patrick Hearn
Thanks to Gemini, you can now talk with Google Maps
Gemini’s Ask about place chip in Google Maps.

Google is steadily rolling out contextual improvements to Gemini that make it easier for users to benefit from AI across its core products. For example, opening a PDF in the Files app automatically shows a Gemini chip to analyze it. Likewise, summoning Gemini while using an app triggers an "ask about screen" option, with live video access, too.
A similar treatment is now being extended to Google Maps. When you open a place card in Maps and bring up Gemini, it now shows an "ask about place" chip right above the chat box. Gemini has been able to access Google Maps data for a while now through its system of "apps" (formerly extensions), but it now appears proactively inside the Maps application.

The name is pretty self-explanatory. When you tap on the “ask about place” button, the selected location is loaded as a live card in the chat window to offer contextual answers. 

Read more
Gemini app finally gets the world-understanding Project Astra update
Gemini Live App on the Galaxy S25 Ultra broadcast to a TV showing the Gemini app with the camera feature open

At MWC 2025, Google confirmed that its experimental Project Astra assistant will roll out widely in March. The feature has now started reaching users, albeit in a phased manner, beginning with Android smartphones.
On Reddit, one user shared a demo video that shows a new "Share Screen With Live" option when the Gemini Assistant is summoned. The Gemini Live interface has also received two new options for live video and screen sharing.
Google has also confirmed to The Verge that these features are now rolling out. Until now, Gemini has only been capable of contextual on-screen awareness courtesy of the "Ask about screen" feature.

Project Astra is the future of Gemini AI

Read more
The Google Pixel 9a is missing a crucial life-saving feature
Person holds Pixel 9a in hand while sitting in a car.

Launched earlier this week, the Pixel 9a packs Pixel 9's magic in a condensed, more affordable form. With a lower price, though, the Pixel 9a loses some perks of the more premium Pixel 9 phones, including some defining Gemini AI features.

Despite running the same Tensor G4 chipset as the Pixel 9 and the 9 Pro, the Pixel 9a misses out on a life-saving feature: satellite-based SOS functionality. Google confirmed the omission to Android Authority, meaning you won't be able to call or text if you get stranded without a cellular network, as you can with the Pixel 9, 9 Pro, and 9 Pro Fold. Google offers these services on premium Pixel phones in the U.S., Canada, the UK, and parts of Europe.
Why Pixel 9a lacks satellite SOS connectivity
The reason Google skipped satellite connectivity on the Pixel 9a is an older modem. Unlike the Pixel 9 series, which uses the newer and more advanced Samsung Exynos 5400 modem, the 9a uses the previous-generation Exynos 5300. The newer modem is what enables 5G non-terrestrial network (NTN) support, which connects to low-orbit communication satellites when no cellular network is available.

Read more