
🤖 Is Google still “Code Red”?

PLUS: ChatGPT 🤝 South Park

GM and Happy Wednesday. Welcome back to Sentient.

In Today’s Email:

  • Is Google still “Code Red”? 🚨

  • ChatGPT 🤝 South Park

  • AI on Mars 🪐

Is Google still “Code Red”? 🚨

Remember Google’s “Code Red” situation from a couple of months ago?

Yesterday, Google and the Technical University of Berlin published research on their new LLM, PaLM-E. The multimodal language model incorporates robotics, vision, and text, and it has 562 billion parameters. 🤯

Check out this video of the PaLM-E model instructing a robot to grab a chip bag.

This entire process was completed without a pre-processed scene; instead, data from the robot’s camera was fed directly into the PaLM-E model for fully autonomous robotic control.

Google’s largest vision transformer ever, ViT-22B, has also been incorporated into PaLM-E. ViT-22B was trained on a variety of visual tasks, including image captioning, object detection, and image classification. We discussed ViT-22B in greater detail here.

PaLM-E is one of the most advanced deep-learning models developed to date and represents a huge step forward in “multimodal” artificial intelligence.

Check out the research on PaLM-E here.

ChatGPT 🤝 South Park

ChatGPT has officially gone mainstream.

We’ve seen ChatGPT on the cover of Time.

This time, ChatGPT will be appearing on an episode of South Park. Check out this preview.

😂

Clyde made us curious, so we tested how good ChatGPT’s game really is. Here are the results.

Ngl, these suck.

South Park’s new episode “Deep Learning” will premiere tomorrow, March 8th at 10 pm ET on Comedy Central.

AI on Mars 🪐

AI might be exactly what we needed to find life on the red planet.

A group of researchers at the SETI Institute’s NASA Astrobiology Institute (NAI) recently released a study discussing how machine learning could be used to find life hiding on Mars.

The researchers conducted their study in an area called Salar de Pajonales in Chile. The location is inhospitable to life; however, it’s not uninhabitable (perfect). A combination of drone footage, samples, and images was used to map out the distribution of life in the sparsely inhabited region. This data was then used to train an AI model to identify areas with a high possibility of hosting life.

How’d it turn out? Good, really good. Once trained, the AI model, combined with the collected statistical ecological data, found signs of life 85% of the time.

How’d the team of researchers do without the help of AI? Bad, really bad. The team spotted biosignatures less than 10% of the time. 🤯

Check out the full press release here.

That’s it for today! We’ll see you back here tomorrow!