Some comic relief for your Friday…
The Hardest Lies to Spot?
Janine Driver, aka “The Human Lie Detector,” recently appeared on the Dr. Drew show to discuss Andrea Sneiderman’s trial.
Sneiderman was charged in connection with her husband’s murder. He was shot and killed by her boss, but she pleaded not guilty to the 16 charges filed against her.
Since then, the murder charge and the other most serious charges have been dropped for lack of evidence. Janine weighed in with her thoughts on the case.
Driver told HLN Network that the hardest lies to spot,
“are the ones where there’s partial truth and partial lie. Unfortunately, we are worse at detecting deception today than we used to be…Because our brain has its very old instinct. It is a very sensitive danger detection system. Because we don’t want to be seen as paranoid, we override signs of potential deception and we put ourselves in a dangerous situation.”
Driver is a body language trainer who offers seminars and workshops on body language and deception detection through her two companies, Lyin Tamer and the Body Language Institute. She is also the author of the New York Times bestseller You Can’t Lie to Me and has previously appeared on Dr. Drew’s show to comment on the high-profile Jodi Arias murder trial.
Freaky Facial Expressions
The realism of robotic facial features is growing at an exponential pace. Huff Post Science has reported on a very “interesting” way to control an android’s face.
Researchers at the University of the West of England in Bristol programmed the robotic face in the video below to respond to electrical signals produced by slime mold, a fungus-like organism that resembles spongy, yellow blobs.
How Does This Work?
When the mold moves toward food, the bot registers a positive expression; when it recoils from light, the robotic face looks downcast.
The university’s Dr. Ella Gale, a research associate in unconventional computing, placed the mold in a small dish on a bed of 64 electrodes to create the robot-mold interface. The electrodes pick up tiny signals from the mold and route them to the robot to produce facial expressions.
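To make the idea concrete, here is a minimal, hypothetical sketch of the kind of mapping described above: 64 electrode readings from the mold are collapsed into a single activity value, which then selects a facial expression. The function names, thresholds, and averaging step are illustrative assumptions, not the researchers’ actual interface code.

```python
# Hypothetical sketch: mapping 64 slime-mold electrode readings to an expression.
# Only the "64 electrodes" detail and the positive/downcast behavior come from
# the article; everything else is an assumption for illustration.

from statistics import mean

NUM_ELECTRODES = 64  # the article describes a bed of 64 electrodes


def choose_expression(readings, positive_threshold=0.1, negative_threshold=-0.1):
    """Map raw electrode signals (e.g. voltages) to a coarse facial expression."""
    if len(readings) != NUM_ELECTRODES:
        raise ValueError(f"expected {NUM_ELECTRODES} readings, got {len(readings)}")

    # Collapse the 64 channels into one activity value; the real interface
    # presumably does something richer with the spatial pattern of signals.
    activity = mean(readings)

    if activity > positive_threshold:
        return "positive"   # mold moving toward food -> upbeat expression
    if activity < negative_threshold:
        return "downcast"   # mold recoiling from light -> downcast expression
    return "neutral"


# Example: simulated readings while the mold advances toward food
print(choose_expression([0.2] * NUM_ELECTRODES))  # -> "positive"
```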
Watch the video below to get a better feel for how the mold controls the robotic “mind”.