If you use the YouTube app on your big screen television set, you may have noticed a couple of changes in recent days: a new sound and animation that appears when the app starts up, and the option to display comments alongside what you're watching.
The sound is official, and YouTube has gone into great detail about it in a blog post. Something "vibrant, engaging and easily recognizable" was required, and so YouTube enlisted the help of sonic branding studio Antfood to get the audio snippet just right.
According to YouTube, the three-second clip moves from "rich, pitch-bending tones that signify the irresistible gravitational pull of YouTube" to a major 7th chord that "represents the way YouTube allows you to explore the things you really love" – and the sound and animation will apparently be appearing in more YouTube apps over time.
The other change to the YouTube app for TVs isn't official but has been spotted on Reddit (via Android Police), and gives you the option to view comments. Here at TechRadar we've also seen the feature pop up in YouTube on an Android TV.
While viewing videos, you get the choice of displaying comments in a sidebar on the right. It might come in handy for those videos that have a lot of discussion below the line that you want to check out, but you still need to fire up the mobile app if you want to respond to comments or add your own.
As yet, YouTube hasn't acknowledged that the comments feature is in testing, but it's clearly visible for some users out there. Whether or not it eventually gets rolled out to everyone using YouTube on a TV remains to be seen.

Analysis: keeping eyeballs on YouTube
Most changes to YouTube and apps like it are to keep more eyeballs on the app for longer – which of course drives engagement and advertising revenue. These most recent updates may not seem hugely significant, but they could still make quite a difference.
The Netflix start-up sound and animation is so well known that the annual Netflix content showcase is named after it. YouTube will be hoping that its own intro clip becomes just as familiar to viewers, and as famous as its static logo.
Adding comments is going to make more of a difference to the actual viewing experience, as it's here that a lot of key discussion and debate around a video goes on. With some clips, the comments are just as interesting as the actual content.
It's worth emphasizing that the comments can be toggled on and off, at least based on our testing – you're not going to have to wade through them all if you don't want to. We'll have to wait and see what YouTube says about this switch officially when the time comes.
An upcoming update to Microsoft Teams is looking to help the hard of hearing stay better engaged in online meetings.
Similar to a recent Zoom update, the new Sign Language View feature allows Microsoft Teams users to choose up to two other video feeds to be centered in the app, making sign language interpreters much more visible throughout the whole meeting.
Microsoft says the chosen feed becomes larger than the others and stays at a high resolution for the clearest possible view. The setting is also client-side only, so you're the only one who sees it – every other participant keeps their own layout. If someone shares content during the meeting, the signer's video will be moved to the side along with everyone else's, but will remain at a larger size. Other participants can still be spotlighted, and doing so won't take space away from the interpreter, either.

Enabling the view
Sign Language View will arrive on Microsoft Teams for both desktop and browser sometime in December, according to an entry on the Microsoft 365 roadmap.
Users can join the Teams Public Preview program to try out the feature, but be aware that the preview is rolling out on a per-user basis. If you're one of the lucky few, Microsoft has a set of instructions on how to turn the view on.
Sign Language View can be enabled for all meetings or on a case-by-case basis. Interpreters who work at the same company as you can be pre-assigned before a meeting via the Settings menu; that way, when you enter a meeting with an interpreter, the view will already be active.
Signers can also be added mid-meeting with the "Manage signers" button found on the new Accessibility pane. Clicking the button lets you designate a participant as an interpreter just by typing in their name. Through the same pane, you can toggle both Sign Language View and Live Captions mid-meeting, too.

Growing Teams
Microsoft has been churning out new features for Teams in recent months, and it's a little tricky to keep track of them all. For example, the company recently added games to the platform as a way to build camaraderie between team members. You have basic titles like Solitaire and Minesweeper, but also more interactive games like Kahoot.
As for what's coming next, we recommend looking through TechRadar’s coverage of future additions coming to Teams. There’s quite a lot. First, the platform is slated to get a performance boost, although it's unknown when exactly. And a Premium version of Teams will be entering its first preview in December 2022, adding AI to help transcribe meetings in 40 different languages and “advanced security features,” among other things.
iOS 16 isn't all about new features, though, as it also brings a huge update to iOS 15's excellent Live Text feature, which arrived last year.
The clever feature has been incredibly useful since its inception, letting users grab phone numbers, addresses, and plenty more from images. The good news is that it's now even better – so with this in mind, here's what's new for Live Text, and how to use it on your iPhone in iOS 16.

What is Live Text and what’s new in iOS 16?
Live Text, in its simplest form, lets you manipulate text found within an image. For example, whether you've taken a photo of a restaurant menu or of a business card you don't want to forget, Live Text will let you copy the text out and use it to send a message, create a note, save an email address, or make a call.
This is done using on-device intelligence, but that does mean that you’ll need an iPhone XS, XR or later.
It’ll also work on iPadOS 16.1, released in October 2022, so you can use the feature across your devices when needed.
However, if you have an iPhone capable of running iOS 16, the headline new feature for Live Text this year is the addition of pulling text from video content.
The idea is that if you pause a video – an educational clip, for example – and want to pull out some of the information for your notes, you can do so simply by highlighting the text on the paused frame.
Another big new feature is the inclusion of 'Quick Actions' for Live Text. This allows you to take a photo of text in another language, for example, and instantly translate it.
Other Quick Actions include starting an email when an email address is detected, calling phone numbers, converting currency, and plenty more.

How to use Live Text in the Camera app
In the Camera app of iOS 16, hold your device so that the text is clearly visible on your screen and tap the icon that looks like a barcode scanner in the corner of the viewfinder.
Doing so brings up the Live Text overlay, and lets you highlight parts of the text shown. You can also tap the 'Copy All' Quick Action in the corner, which will copy all of the text to your clipboard so you can paste it into an app immediately.
In the example above, you can see that Live Text gets a little muddled and thinks the image is upside down, but copying the text will, naturally, paste it in the correct orientation.

How to use Live Text in the Photos app
The same process applies to images and videos in the Photos app. Simply tap the aforementioned barcode-style button and you'll get the option to 'Copy All' or highlight individual parts of the text.
This can be particularly helpful if you haven’t got time to hover and copy down some information – simply grab a photo or video, and come back to it in your own time.

How to use Live Text in the Translate app
One of the big additions in iOS 16 is being able to access Live Text through the impressive Translate app. If you’ve not used the app before, you’ll find it in your App Library, or via a Spotlight search.
Once you’re in, there’s a new option at the bottom of the screen to open the camera. It won’t translate in real time – you’ll need to take a photo – but the app works incredibly quickly.
You can pick the language after the fact, too, meaning if you need to translate a phrase into multiple languages, you can do so with ease.