🤝 Be more productive with these two new assistants
PLUS: “Black Mirror Glasses” recognize you at a glance ...
AI-HOI AInauts,
Welcome to the latest issue of your favorite newsletter - today with more practical insights into two of the most important flagship products from OpenAI and Google, and another Black Mirror episode that became reality 👓 ...
Here's what we have in store for you:
🤝 ChatGPT Canvas - how to use your new assistant
📝 Why you should use Google's NotebookLM (more often)
📸 Black Mirror becomes reality: How Ray-Ban smart glasses can digitally scan you at a glance
And in case you missed the most exciting Quick News, just check out the Weekend issue. Let's go!
🤝 ChatGPT Canvas - how to use your new assistant
OpenAI wants to make headlines with its progress rather than with internal drama. The new Canvas user interface was released just last week, and it is the first major update since ChatGPT’s introduction two years ago! It makes interacting with the chatbot much more efficient, goal-oriented and collaborative.

When and how can you use ChatGPT Canvas?
Many compare Canvas to Anthropic's Claude Artifacts and Google's NotebookLM, but these tools each have their own unique abilities.
ChatGPT Canvas offers a collaborative workspace that is perfect for programming and writing, Claude Artifacts excels at creative app creation, and NotebookLM shines at managing large amounts of information.
A paid ChatGPT Plus or Team account is currently required; after the beta phase, Canvas will also be available to free users. For now it only works on the web, not yet in the mobile and desktop apps.
Using it is straightforward: simply select "GPT-4o with Canvas" in the model selector and off you go!
What can the new ChatGPT Canvas feature do?
It is an open workspace with a context-dependent user interface and detailed iteration options.
You have a separate editing window next to the chat interface
You can edit and refine text and code directly
You receive contextual suggestions and feedback from ChatGPT
When writing texts, you can easily adjust the length and reading level and then have the text checked for grammar, clarity and consistency. And the world has been waiting for this: you can add emojis with one click! 😁
So Canvas is perfect for writing all kinds of content - you might soon be spending more time in ChatGPT than in Word or Google Docs.
The option to undo individual steps and use keyboard shortcuts is also super practical. Previously, you had to regenerate the entire output over and over to refine what the chatbot spat out.

Amateur programmers (or those who want to become one) will appreciate the coding features: reviewing code, adding comments directly, debugging, and even converting between programming languages! You can generate code of all kinds, from browser extensions to Replit projects - we've sketched below what such a generated extension might look like.
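To make that concrete, here is a minimal sketch of the kind of thing Canvas can scaffold in a couple of prompts: a tiny Chrome extension (Manifest V3) that grabs whatever text you have selected on a page when you click its toolbar icon. The names and structure are our own illustration, not actual Canvas output.

```js
// background.js - service worker of a minimal Chrome extension (Manifest V3).
// The manifest also needs: "action": {} plus "permissions": ["activeTab", "scripting"]
// and "background": { "service_worker": "background.js" }.
chrome.action.onClicked.addListener(async (tab) => {
  // Run a tiny function inside the active tab and return its current text selection
  const [injection] = await chrome.scripting.executeScript({
    target: { tabId: tab.id },
    func: () => window.getSelection()?.toString() ?? "",
  });
  console.log("Clipped text:", injection.result); // replace with your own handling
});
```

From there you iterate in Canvas: ask it to add a popup, error handling, or to send the clipped text somewhere useful (more on that in the NotebookLM section below).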
Anthropic definitely opened a new chapter with the release of Artifacts, and ChatGPT has clearly taken inspiration from it. With Claude, you can develop your app, use it directly in the browser and even share it with others. ChatGPT doesn't go that far at the moment; you can only create and edit the code.
But unlike Claude, ChatGPT Canvas also has access to the internet - a really helpful feature. Be aware, though: "prompt injections" are theoretically possible …

Pliny, the Latent Space Liberator, has already recovered the system prompt - and you can do it too. Why should you do this? Because it's super exciting to explore the "secret" instructions that control the behavior of the model.
What's next? Agents and multimodal AI, always on ...
OpenAI is still keeping quiet about what comes next. But we're happy to gaze into the crystal ball and spin a few ideas of our own.

It seems certain that we can expect agents next year. And that will be a real paradigm shift! These little helpers will be able to perform tasks for you independently. Until then, OpenAI and the others will introduce us to this new world step by step.
Now that Advanced Voice Mode is also available (free users now get 10 minutes per month), we can assume that the modalities will keep expanding and that we will soon be able to give the chatbot more context not only with voice, but also with images and video.
It won't be long before we have a constant companion looking over our shoulder and proactively working with us ... and if we take these thoughts one step further, we get the feeling that the world could look very different in a few years from now ...
📝 Why you should use Google's NotebookLM (more often)
After the ChatGPT Canvas Deep Dive, we have to come back to NotebookLM. Is NotebookLM the ChatGPT moment for Google? It could very well be!
Even though the tool has been around for over a year, it's only now that we've really come to appreciate what it can do (also thanks to the multimodal Gemini 1.5 Pro model). We've experimented with it a lot - now it's your turn. Here are our best tips and use cases for you.

Google is often perceived as a sluggish juggernaut, but the NotebookLM team has taken a different approach and keeps developing the app together with its users. Feedback is continuously integrated, and Product Manager Raiza Martin is just as active on X as Editorial Director Steven Johnson - even AI designer Jay Spiel has revived his old account.
Aside from TPUs running hot today, five things from Notebook HQ:
1) Thanks for all your feedback on AOs so far. I’m copy pasting everything into a Notebook so I can listen to a Deep Dive and search it later. We’re going to launch some immediate tweaks to make it less repetitive,… x.com/i/web/status/1…
— Raiza Martin (@raiza_abubakar)
6:20 PM • Sep 30, 2024
And because the team is so open and interacts with the users, we also know that they have some exciting things in the pipeline:
The viral podcast feature "Audio Overviews" already works in other languages but is still being tested for quality. It originally took only a few weeks from idea to prototype!
Some resourceful users have managed to coax out French or Spanish versions, but Google has apparently already closed that loophole.
Other options such as format, length, personas, voices etc. are also being discussed.
New features such as "MagicDraft" and personalized chatbots (similar to "Gems") are being tested extensively.
User-specific chatbots are already widely used internally at Google and are massive productivity boosters.
Soon, a personalized writing workflow with agents could formulate your own point of view in your language, aided by research and notes.

Preview of the "MagicDraft" feature via @testingcatalog
What are the best use cases for NotebookLM?
Huge context window: Did you know that you can have up to 50 different sources per notebook, and each source can contain up to 500,000 words? That adds up to 25 million (!) words per notebook.
Your favorite podcast (currently English only): You can have any content turned into an entertaining audio discussion. Upload an eBook you've always wanted to read (or the best book on any given topic). Or turn Lex's 4-hour interviews from your "Watch Later" list into snappy summaries (yes, YouTube video transcripts and MP3/WAV uploads are now supported). Pro tip: give the two AI hosts instructions on what to talk about.
Podcasts on the go: You can open the NotebookLM website in your browser, but if you want to listen to Audio Overviews in the background, it's best to download them or use an app such as solopod.net, which produces an RSS feed you can connect to Spotify or other tools.
Create a video podcast: You can also use the audio file as the basis for a video and animate it with your avatar (e.g. with HeyGen).
Studying and learning with AI: All kinds of study guides can be created from PDFs, notes or lectures (audio) and insights can be drawn from various sources. Scientific material and academic texts can also be made more easily accessible.
Archive everything that is relevant to you: You can "feed" it all into a Google Doc (e.g. with a browser extension you create in ChatGPT Canvas, or with ReadWise, or ...) - and once you attach the Google Doc to a notebook, it will update automatically! See the sketch after this list for one way to wire this up.
Meeting summary and more: You can also upload a transcript or audio file (we love tl;dv!) and ask questions/get feedback. Instead of meeting minutes or management summaries, next time just send your manager an Audio Overview 😁.
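Picking up the "archive everything" idea from the list above: one possible way to wire it up - our own sketch, not an official Google or NotebookLM recipe - is a small Google Apps Script Web App that appends whatever you POST to it to the Google Doc attached to your notebook. DOC_ID and the Web App URL are placeholders you fill in yourself.

```js
// Google Apps Script (script.google.com) - deploy this as a Web App, then have your
// Canvas-built browser extension, ReadWise export, or any other tool POST plain text to its URL.
const DOC_ID = "YOUR_GOOGLE_DOC_ID"; // placeholder: ID of the Doc attached to your notebook

function doPost(e) {
  const text = e.postData.contents;                 // raw body of the POST request
  const body = DocumentApp.openById(DOC_ID).getBody();
  body.appendParagraph(new Date().toISOString());   // timestamp each clipping
  body.appendParagraph(text);
  return ContentService.createTextOutput("ok");
}

// On the extension side, sending a clipping is a one-liner:
// fetch("YOUR_WEB_APP_URL", { method: "POST", body: clippedText });
```

Whether you trigger it from the little extension sketched in the Canvas section or from another tool is up to you - the Doc simply keeps growing, and your notebook reads from it.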

P.S.: Sam Altman is a fan, Andrej Karpathy has recorded 10 podcast episodes about the "Histories of Mysteries" and HeyGen CEO Joshua Xu has combined the audio with his new AI avatars. What are your ideas?
Much appreciated from the NotebookLM team @sama! Full circle in a way because it was my writing about GPT-3 that brought me to Google in the first place.
— Steven Johnson (@stevenbjohnson)
12:07 AM • Oct 2, 2024
📸 Black Mirror becomes reality: How Ray-Ban smart glasses can digitally scan you at a glance
Imagine you're walking through the city, and someone scans your face to find out your name, social media profile, phone number and even your address in a matter of seconds.
Ummm, that could theoretically happen to you! Welcome to the brave new world, powered by Meta Ray-Ban Smart Glasses and the ominous "I-XRAY" tool developed by two Harvard students.
Are we ready for a world where our data is exposed at a glance? @CaineArdayfio and I offer an answer to protect yourself here:
tinyurl.com/meet-ixray
— AnhPhu Nguyen (@AnhPhuNguyen1)
4:10 PM • Sep 30, 2024
One look is enough - this is how it works
Harvard students AnhPhu Nguyen and Caine Ardayfio have shown in a demo how freely available tools such as PimEyes and our beloved Ray-Ban Meta Smart Glasses can be used to get detailed information about strangers in a matter of seconds.
The glasses simply record live video, which is streamed and fed into facial recognition software. The software searches public databases for images, names and even your address, and reports the results back to the user - fully automated, of course.
The two developers emphasize that they built the tool only to raise public awareness. They do not plan to publish it and even share tips on how to remove yourself from such databases.
But honestly: there will probably be a copycat app on GitHub this week. We'll keep you posted …
We made it! But no need to be sad. The AInauts will be back soon, with new food for thought.
Reto & Fabian from the AInauts
P.S.: Follow us on social media - that motivates us to keep going 😁!
Twitter, LinkedIn, Facebook, Insta, YouTube, TikTok
Your feedback is essential for us. We read EVERY comment and every piece of feedback - just reply to this email. Tell us what was (not) good and what is interesting for YOU.
🌠 Please rate this issue: Your feedback is our rocket fuel - to the moon and beyond!