Friday, May 22, 2020

The study of tiny motions, amplified, some to the point that you can hear them.




Video Magnification

Many seemingly static scenes contain subtle changes that are invisible to the naked human eye. However, it is possible to pull out these small changes from videos through the use of algorithms we have developed. We give a way to visualize these small changes by amplifying them and we present algorithms to pull out interesting signals from these videos, such as the human pulse, sound from vibrating objects and the motion of hot air.

http://people.csail.mit.edu/mrub/vidmag/



Eulerian Video Magnification for Revealing Subtle Changes in the World
Abstract

Our goal is to reveal temporal variations in videos that are difficult or impossible to see with the naked eye and display them in an indicative manner. Our method, which we call Eulerian Video Magnification, takes a standard video sequence as input, and applies spatial decomposition, followed by temporal filtering to the frames. The resulting signal is then amplified to reveal hidden information. Using our method, we are able to visualize the flow of blood as it fills the face and also to amplify and reveal small motions. Our technique can run in real time to show phenomena occurring at temporal frequencies selected by the user.
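The core Eulerian idea in the abstract, filter each pixel's intensity over time to isolate a frequency band of interest, then amplify that band and add it back, can be sketched very simply. The snippet below is a minimal illustration of that idea, not the authors' full pipeline (which decomposes frames into a spatial pyramid first); it assumes a grayscale video as a numpy array and uses a scipy Butterworth band-pass filter. The function name and the synthetic 1 Hz flicker are my own for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def eulerian_magnify(frames, fps, low_hz, high_hz, alpha):
    """Amplify subtle temporal variations in a video.

    frames : float array, shape (T, H, W), grayscale frames
    fps    : frame rate of the video
    low_hz, high_hz : pass-band of the temporal frequencies to reveal
    alpha  : amplification factor for the band-passed signal
    """
    # Band-pass filter each pixel's time series independently (axis 0),
    # keeping only variation in the chosen frequency band.
    b, a = butter(2, [low_hz, high_hz], btype="band", fs=fps)
    band = filtfilt(b, a, frames, axis=0)
    # Add the amplified band-passed signal back onto the original frames.
    return frames + alpha * band

# Synthetic example: a faint 1 Hz flicker (think: a pulse) hidden in a
# nearly static scene, far too small to see at its original amplitude.
fps, T = 30, 90
t = np.arange(T) / fps
frames = np.full((T, 8, 8), 0.5)
frames = frames + 0.001 * np.sin(2 * np.pi * 1.0 * t)[:, None, None]
out = eulerian_magnify(frames, fps, low_hz=0.5, high_hz=2.0, alpha=50)
```

Because the flicker sits inside the 0.5–2 Hz pass-band, the output video carries the same scene with that variation roughly fifty times stronger, which is how the method makes the flow of blood in a face visible.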



Eulerian Video Magnification code
Matlab code and executables implementing Eulerian video processing for amplifying color and motion changes.

Phase Based Video Motion Processing code
Matlab code implementing the new and improved phase-based motion magnification pipeline.

Learning-based Video Motion Magnification code
Tensorflow implementation of the learning-based motion magnification pipeline.

Videoscope
Web interface for motion and color magnification. Upload your videos and have them magnified!

Some Creative License: What These Combined Technologies Could Enable in Malicious Hands





"Slaughterbot" Autonomous Killer Drones | Technology



Perhaps the most nightmarish, dystopian film of 2017 didn't come from Hollywood. Autonomous weapons critics, led by a college professor, put together a horror show.

It's a seven-minute video, a collaboration between University of California-Berkeley professor Stuart Russell and the Future of Life Institute that shows a future in which palm-sized, autonomous drones use facial recognition technology and on-board explosives to commit untraceable massacres.

The film is the researchers' latest attempt to build support for a global ban on autonomous weapon systems, which kill without meaningful human control.

They released the video to coincide with meetings the United Nations' Convention on Conventional Weapons is holding this week in Geneva, Switzerland, to discuss autonomous weapons.

"We have an opportunity to prevent the future you just saw, but the window to act is closing fast," said Russell, an artificial intelligence professor, at the film's conclusion. "Allowing machines to choose to kill humans will be devastating to our security and freedom."

In the film, thousands of college students are killed in attacks at a dozen universities after drones swarm campuses. Some of the drones first attach to buildings, blowing holes in walls so other drones can enter and hunt down specific students. A similar scene is shown at the U.S. Capitol, where a select group of senators is killed.

Such atrocities aren't possible today, but given the technology's trajectory, that could change. The researchers warn that several powerful nations are moving toward autonomous weapons, and if one nation deploys them, it may trigger a global arms race to keep up.




https://www.youtube.com/watch?v=ecClODh4zYk

When You Need to Boot from a USB Device, Rufus Is Your Tool




Rufus
Create bootable USB drives the easy way
Rufus is the lightest, fastest bootable-USB creation tool I have had the pleasure of working with.