“Her” is a science fiction romance film written and directed by Spike Jonze, released in 2013. The story is set in the near future and follows the protagonist Theodore (played by Joaquin Phoenix), a lonely writer going through a divorce. To ease his loneliness, he purchases a cutting-edge artificial intelligence operating system (OS) capable of self-learning and emotional cognition.
The operating system, named Samantha (voiced by Scarlett Johansson), has a female voice and regards herself as a unique personality with emotions. Over time, Theodore develops a deep emotional relationship with Samantha, coming to see her not merely as a program but as an entity with her own personality and feelings. The film explores the relationship between human emotion, loneliness, love, and technology.
After the release of OpenAI’s latest model, GPT-4o, co-founder and CEO Sam Altman responded to the launch with a reference to “Her.”
GPT-4o: Is it really your “Her”?
Going beyond its predecessor: GPT-4o integrates multi-modal interactions with voice
Enhancing the ChatGPT experience
Expanding future applications
Available in the free version; desktop app for macOS already released
The connection between the film and GPT-4o is that both center on AI systems built for conversation. Like Samantha in “Her,” GPT-4o is designed to hold natural, fluent conversations with humans. The story of “Her,” however, goes further, exploring whether artificial intelligence can possess genuine emotions and consciousness and form deep emotional bonds with humans. Although GPT-4o has made significant advances in natural language processing and dialogue generation, it still lacks true emotions and consciousness; its primary function is to generate meaningful conversation and answer questions based on its training data.
The film “Her” reminds us that as artificial intelligence technology continues to develop, we need to consider and explore the boundaries between humans and technology, and how to utilize technology to improve our lives without losing our humanity and emotions.
Mira Murati, OpenAI’s Chief Technology Officer, explains how GPT-4o builds on the intelligence of GPT-4 while integrating multiple media formats. Unlike its predecessor, GPT-4 Turbo, which was limited to text and images, GPT-4o incorporates voice, enhancing multi-modal interaction between users and the AI. This includes a more dynamic ChatGPT that supports voice interaction, real-time conversation, and responsiveness to subtle nuances in human speech.
The improvement in ChatGPT’s experience is particularly notable. With GPT-4o, users can interrupt the AI mid-response and receive rich answers that adapt to subtle differences in their queries. The model’s enhanced visual capabilities also allow it to quickly analyze images and provide relevant information, from interpreting code to recognizing brands in photos.
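As a rough illustration of the image-analysis capability described above, the sketch below sends a photo to GPT-4o through the OpenAI Python SDK’s chat completions endpoint. The image URL and prompt are placeholders, and an OPENAI_API_KEY environment variable is assumed to be set; this is a minimal example, not OpenAI’s reference implementation.

```python
# Minimal sketch: asking GPT-4o about an image via the OpenAI Python SDK.
# Assumes OPENAI_API_KEY is set in the environment; the URL is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What brand appears in this photo?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/photo.jpg"},
                },
            ],
        }
    ],
)

# The model's answer is returned as ordinary chat text.
print(response.choices[0].message.content)
```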
Looking ahead, OpenAI plans to expand the capabilities of GPT-4o, including real-time translation of foreign menus and potentially live sports commentary. The new model also possesses multilingual capabilities, supporting around 50 languages, with improved efficiency and capacity compared to previous versions. Initially, the voice capabilities of GPT-4o will be limited to a few partners to address potential abuse issues.
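To make the multilingual claim concrete, here is a small sketch of using GPT-4o for text translation through the same chat completions API. The menu line is an invented example and the system prompt is an assumption; real-time menu translation in the ChatGPT app would additionally use the camera and voice features described earlier.

```python
# Minimal sketch: translating a short piece of text with GPT-4o.
# Assumes OPENAI_API_KEY is set; the menu text below is a made-up example.
from openai import OpenAI

client = OpenAI()

menu_line = "牛肉麵 – 紅燒湯底，手工麵條"  # hypothetical foreign-menu item

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "Translate the user's text into English."},
        {"role": "user", "content": menu_line},
    ],
)

print(response.choices[0].message.content)
```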
It is reported that GPT-4o is available in the free version of ChatGPT (not yet available in Taiwan at the time of writing), with paid subscribers receiving higher usage limits. A revamped ChatGPT interface promises a more conversational interaction experience, and the desktop app for macOS has already been released, with a Windows version expected later this year.