Educators, speech therapists and healthcare workers who use Skype will like this news.
Skype has developed its Translator beta tool to remove spoken-language barriers between speakers of different languages (and between speakers of the same language, in the case of people with hearing issues).
English and Spanish are the first two languages supported by Skype Translator, which uses machine learning to improve the accuracy of its translations over time.
How Will Users Benefit?
1. When two people talk via Skype in two different languages, the speech recognition engine acts as an ‘intelligent’ third participant, translating their conversation as needed.
2. Users’ conversations appear as translated text segments on the right-hand side of the Skype screen, with instant messaging in 40 languages in the pipeline for Windows 8.1 devices.
3. Families using Skype for speech therapy via telepractice can refer to the on-screen text as a child works through a verbal instruction or conversational exchange during a session.
4. Schoolchildren and students Skyping each other in two different languages to practise conversational skills have a text backup of the conversation thread as they talk.
5. Recruitment candidates who are deaf and interviewing via spoken conversation on Skype will have text cues for what is said during screen time with one or more interviewers.
6. Online meetings with multiple attendees can be held on the Skype for Business platform with instant messaging, telepresence, document sharing and additional services.
The Technical Details
As users converse via the Translator system, the machine-learning software applies speech recognition to each conversation and draws on the accumulated context, so the product becomes smarter the more it is used.
Data sources used to build the Translator beta’s speech recognition capabilities include captioned video, one-to-one and previously translated conversations, and web pages. Time will tell how effective the platform proves to be, with voice technology at a tipping point in 2015.
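To picture how the pieces described above fit together, here is a minimal, purely illustrative sketch of a speech-to-translated-text pipeline of the kind the article describes. It is not Microsoft's implementation, and every name in it (Utterance, recognize_speech, translate_text, relay_utterance) is a hypothetical placeholder for the corresponding stage.

```python
# Illustrative sketch only: a speech -> text -> translation -> display loop,
# assuming hypothetical recognizer and translator components. None of these
# names come from Skype's actual API.
from dataclasses import dataclass


@dataclass
class Utterance:
    speaker: str
    language: str   # e.g. "en" or "es"
    audio: bytes    # raw audio captured from the call


def recognize_speech(audio: bytes, language: str) -> str:
    """Placeholder for a speech recognition engine (audio -> text)."""
    raise NotImplementedError


def translate_text(text: str, source: str, target: str) -> str:
    """Placeholder for a machine translation model (text -> text)."""
    raise NotImplementedError


def relay_utterance(utt: Utterance, target_language: str) -> dict:
    """Act as the 'intelligent third participant': transcribe one speaker's
    utterance and translate it for the other person, returning both strings
    so the client can show them in the chat panel."""
    transcript = recognize_speech(utt.audio, utt.language)
    translation = translate_text(transcript, utt.language, target_language)
    return {
        "speaker": utt.speaker,
        "original": transcript,     # what the speaker actually said
        "translated": translation,  # what the other participant reads
    }
```

In practice the real system also handles voice output of the translation, but the core idea is the same: each utterance is transcribed, translated and displayed as a text segment alongside the call.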