Apple Vision Pro vs Quest 3; ChatGPT has memory now.

🌐 Tech and AI Developments at a Glance

🧠 ChatGPT is getting a digital memory

🥽 Zuckerberg says Quest 3 is better than Vision Pro in every way

💬 Slack is getting a major Gen AI boost

🤖 Sam Altman warns that 'societal misalignments' could make AI dangerous

💻 Nvidia unveils tool for running GenAI on PCs

This week’s biggest news is the launch of OpenAI’s text-to-video model: check out Sora. Just over a year ago, ChatGPT changed the way we interact with AI. In that short time we have seen new LLMs arrive, text-to-image models get steadily better, and now a model that generates video from a few lines of text. The sample clips are mind-blowing. We have come a long way, and the possibilities feel limitless.

🧠 ChatGPT is getting a digital memory LINK

  • OpenAI is introducing a "memory" feature for ChatGPT to remember users' preferences and conversation details, aiming to personalize interactions more effectively.

  • The memory feature can learn specifics about a user either through direct instruction or by gradually picking up details from interactions, and each custom GPT maintains its own separate memory for more personalized service.

  • While the memory feature addresses the need for a more personalized AI experience, it also raises privacy concerns; OpenAI therefore offers controls for users to manage what ChatGPT remembers and includes a Temporary Chat mode for conversations that should not be retained. A minimal sketch of how such a memory layer might work follows this list.
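
To make the idea concrete, here is a minimal illustrative sketch of a memory layer around a chat model. This is not OpenAI's implementation; the JSON file, the helper names, and the choice to prepend remembered facts as system context are assumptions, chosen only to mirror the behaviour described above: facts can be added explicitly, wiped by the user, and skipped entirely in a temporary chat.

```python
import json
from pathlib import Path

MEMORY_FILE = Path("chat_memory.json")  # hypothetical per-user store of remembered facts


def _read() -> dict:
    return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else {}


def remember(user_id: str, fact: str) -> None:
    """Store a fact the user stated directly, e.g. 'prefers concise answers'."""
    data = _read()
    data.setdefault(user_id, []).append(fact)
    MEMORY_FILE.write_text(json.dumps(data, indent=2))


def forget(user_id: str) -> None:
    """The 'manage memory' control: wipe everything remembered about a user."""
    data = _read()
    data.pop(user_id, None)
    MEMORY_FILE.write_text(json.dumps(data, indent=2))


def build_messages(user_id: str, user_message: str, temporary: bool = False) -> list[dict]:
    """Prepend remembered facts as system context; a temporary chat skips memory entirely."""
    messages = []
    facts = [] if temporary else _read().get(user_id, [])
    if facts:
        messages.append({"role": "system",
                         "content": "Known about this user: " + "; ".join(facts)})
    messages.append({"role": "user", "content": user_message})
    return messages  # pass this to whichever chat-completion API you use
```

In a real product the store would live server-side and facts would also be extracted automatically from conversations, but the user-facing controls (remember, forget, temporary) mirror the ones OpenAI describes.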

🥽 Zuckerberg says Quest 3 is better than Vision Pro in every way LINK

  • Mark Zuckerberg, CEO of Meta, stated on Instagram that he believes the Quest 3 headset is not only a better value but also a superior product compared to Apple's Vision Pro.

  • Zuckerberg emphasized the Quest 3's advantages over the Vision Pro, including its lighter weight, the absence of a wired battery pack (allowing freer movement), a wider field of view, and a more immersive content library.

  • While acknowledging the Vision Pro's strength as an entertainment device, Zuckerberg highlighted the Quest 3's significant cost benefit, being "like seven times less expensive" than the Vision Pro.

💬 Slack is getting a major Gen AI boost LINK

  • Slack is introducing AI features allowing for summaries of threads, channel recaps, and the answering of work-related questions, initially available as a paid add-on for Slack Enterprise users.

  • The AI tool can summarize unread messages or messages from a specified timeframe, and lets users ask questions about workplace projects or policies, with answers drawn from previous Slack messages; a hand-rolled sketch of this summarize-and-ask pattern appears after this list.

  • Slack is expanding its AI capabilities to integrate with other applications, summarizing external documents and building a new digest feature to highlight important messages, with a focus on keeping customer data private and siloed.
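
For anyone who wants to prototype the same summarize-and-ask workflow by hand before the paid add-on reaches their plan, here is a rough sketch that pulls recent channel history with the official Slack SDK and hands it to a model of your choice. The bot token, channel ID, and call_llm helper are placeholders, not part of Slack AI.

```python
from slack_sdk import WebClient  # official Slack SDK: pip install slack_sdk

client = WebClient(token="xoxb-your-bot-token")  # assumed bot token with channels:history scope


def call_llm(prompt: str) -> str:
    """Stand-in for whichever LLM you call (hosted API or a local model)."""
    raise NotImplementedError("plug your model call in here")


def fetch_messages(channel_id: str, oldest_ts: str | None = None, limit: int = 200) -> list[str]:
    """Pull recent messages from a channel, optionally from a given timestamp onward."""
    resp = client.conversations_history(channel=channel_id, oldest=oldest_ts, limit=limit)
    return [m["text"] for m in resp["messages"] if m.get("text")]


def recap_channel(channel_id: str) -> str:
    """Roughly what a channel recap does: summarize the latest activity."""
    history = "\n".join(reversed(fetch_messages(channel_id)))
    return call_llm("Summarize the key decisions and open questions in this Slack channel:\n" + history)


def ask_workspace(channel_id: str, question: str) -> str:
    """Answer a work question grounded only in previous messages from the channel."""
    history = "\n".join(reversed(fetch_messages(channel_id)))
    return call_llm(f"Using only these Slack messages:\n{history}\n\nAnswer this question: {question}")
```

Slack AI itself runs inside Slack with customer data kept siloed; the sketch above only shows the shape of the workflow.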

🤖 Sam Altman warns that 'societal misalignments' could make AI dangerous LINK

  • OpenAI CEO Sam Altman expressed concerns at the World Governments Summit about the potential for 'societal misalignments' caused by artificial intelligence, emphasizing the need for international oversight similar to the International Atomic Energy Agency.

  • Altman urged attention not just to dramatic scenarios like killer robots but also to the subtle ways AI could unintentionally cause societal harm, and advocated for regulatory measures not led by the AI industry itself.

  • Despite the challenges, Altman remains optimistic about the future of AI, comparing its current state to the early days of mobile technology, and anticipates significant advancements and improvements in the coming years.

💻 Nvidia unveils tool for running GenAI on PCs LINK

  • Nvidia is releasing a tool named "Chat with RTX" that enables owners of GeForce RTX 30 Series and 40 Series graphics cards to run an AI-powered chatbot offline on Windows PCs.

  • "Chat with RTX" allows customization of GenAI models with personal documents for querying, supporting multiple text formats and even YouTube playlist transcriptions.

  • Despite limitations such as its inability to remember conversational context and the variable relevance of its responses, "Chat with RTX" reflects a growing trend of running GenAI models locally for greater privacy and lower latency; a rough sketch of that local pattern is shown below.
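
The underlying pattern, retrieving the most relevant local files and generating an answer with a model that runs on your own GPU, is easy to sketch. The snippet below illustrates that general local retrieve-and-generate approach, not Nvidia's implementation; the model name, folder path, and TF-IDF retrieval are assumptions.

```python
from pathlib import Path

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from transformers import pipeline

# Your personal documents; "Chat with RTX" similarly indexes local text files.
docs = [p.read_text(encoding="utf-8") for p in Path("my_notes").glob("*.txt")]

# Lightweight retrieval: rank documents by TF-IDF similarity to the question.
vectorizer = TfidfVectorizer().fit(docs)
doc_vectors = vectorizer.transform(docs)

# Any locally hosted chat model works; Mistral-7B-Instruct is only an example
# and needs a GPU with enough memory. device=0 keeps inference on the local card.
generator = pipeline("text-generation",
                     model="mistralai/Mistral-7B-Instruct-v0.2",
                     device=0)


def ask(question: str, top_k: int = 2) -> str:
    """Answer a question using only the most relevant local documents as context."""
    scores = cosine_similarity(vectorizer.transform([question]), doc_vectors)[0]
    context = "\n\n".join(docs[i] for i in scores.argsort()[-top_k:])
    prompt = (f"Answer the question using only this context:\n{context}\n\n"
              f"Question: {question}\nAnswer:")
    out = generator(prompt, max_new_tokens=200, do_sample=False)
    return out[0]["generated_text"][len(prompt):].strip()
```

Nothing in this loop leaves the machine, which is exactly the privacy and latency argument for running GenAI locally.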