You’ve seen the world’s fastest camera drone trying to chase down Max Verstappen; now check out Formula 1’s new gyro cam mounted on Lando Norris’ McLaren. What you’re about to see is a practice lap from the 2024 Dutch Grand Prix this past weekend.
Robosen’s massive Transformers Megatron robot isn’t just an action figure, as this thing can auto-convert to a tank at the push of a button. Voiced by Frank Welker, it’s touted as the world’s first dual-form bipedal walking robot, complete with 112 ultra-bright LED lights that illuminate Megatron with stunning blue and violet accents.
There’s the Agfa ePhoto CL30, and then there’s this Panasonic PV-SD4090 digital camera from 1999 that uses SuperDisk media to store photos. This device is capable of capturing 1,280 x 960 resolution JPEG images, while its time-lapse mode can be set to take an image once every 1, 5, 10, or 30 minutes, or every 1, 6, 12, or 24 hours.
Astronomers used Hubble Space Telescope data to visualize what starburst galaxy NGC 1569, located 10.96 million light-years from Earth in the constellation Camelopardalis, could sound like. This star factory creates stars at a rate 100 times faster than our own galaxy, the Milky Way.
Sure, the Super Mario 64 PC port may have been taken down, but this fan-made Super Mario and the Rainbow Stars game has 7 chapters and is still available…for now. Put simply, it attempts to combine the best of Paper Mario and the Mario & Luigi series into a side-scrolling 2D platformer.
Carnegie Mellon University (CMU) researchers unveil Picotaur, a fingertip-sized micro robot that can not only run and turn, but also play soccer…sort of. Its legs are driven by multiple actuators so it can achieve various locomotion capabilities.
KRAFTON’s inZOI life simulation game, powered by Unreal Engine 5, lets players alter any aspect of their world to create unique stories and experiences, just like in the real world. You’ll be able to search for jobs to make a living, all while forming deep relationships through interactions.
Disney research engineers have made it possible for a robot to learn policies from unstructured motion data. First, they extracted a latent-space encoding by training a variational autoencoder that takes short windows of motion from the unstructured data as input.
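The windowing step described above — slicing unstructured motion data into short fixed-length clips before they reach the autoencoder — can be sketched roughly like this (the window length and stride here are illustrative assumptions, not Disney’s actual parameters):

```python
def motion_windows(frames, window=30, stride=10):
    """Slice a motion sequence (a list of per-frame pose vectors) into
    short overlapping windows, the kind of input a VAE encoder would
    consume. The window/stride values are illustrative assumptions."""
    return [frames[i:i + window]
            for i in range(0, len(frames) - window + 1, stride)]

# Example: a 100-frame clip yields 8 overlapping 30-frame windows
clips = motion_windows(list(range(100)))
```

Each window then becomes one training sample for the variational autoencoder, whose encoder maps it to a point in the learned latent space.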