Chewing tobacco riskier than smoking: Study


Chewing tobacco poses a bigger threat than smoking, according to a study conducted by a group of city oncologists. Ahead of World No Tobacco Day on May 31, data compiled by the experts suggest that more than half of the city's tobacco-induced cancer patients are gutkha consumers rather than smokers. Perhaps even more alarmingly, the average age of patients suffering from head-and-neck cancer - generally triggered by continuous tobacco use - has gone down to 25 years.

Conducted by Bengal Oncology, the study reveals that head-and-neck cancers could drop from the present 45% of all cancer cases in Bengal to less than 20% if chewing tobacco were effectively prohibited. Even though gutkha has been banned in the state, sales haven't stopped. The figures also show that the number of tobacco-chewers in the city is rising faster than the number of smokers.

Probiotic yogurt may change brain function



The researchers found that, compared with women who didn't consume the probiotic yogurt, those who did showed a decrease in activity in both the insula - which processes and integrates internal body sensations, like those from the gut - and the somatosensory cortex during the emotional reactivity task.
Further, in response to the task, these women had a decrease in the engagement of a widespread network in the brain that includes emotion-, cognition- and sensory-related areas. The women in the other two groups showed a stable or increased activity in this network.
During the resting brain scan, the women consuming probiotics showed greater connectivity between a key brainstem region known as the periaqueductal grey and
cognition-associated areas of the prefrontal cortex.
The women who ate no product at all, on the other hand, showed greater connectivity of the periaqueductal grey to emotion- and sensation-related regions, while the group consuming the non-probiotic dairy product showed results in between.

The robot butler that can tend to your every need - even predicting when you want a beer AND pouring it for you

  • The robot, developed at Cornell University, uses Kinect sensors, 3D cameras and a database of videos to work out what its owner wants
  • In tests, the robot correctly anticipated its owner's needs 82% of the time
A beer-pouring robot that can read your body movements and anticipate when you want another drink has been developed by American students.
Researchers from Cornell University used Microsoft Kinect sensors and 3D cameras to help the robot analyse its surroundings and identify its owner's needs.
The robot then uses a database of videos showing 120 various household tasks to identify nearby objects, generate a set of possible outcomes and choose which action it should take - without being told.
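The article's description boils down to a simple idea: match the actions observed so far against a library of recorded task sequences, then pick the action that most often comes next. The sketch below is a hypothetical illustration of that idea; the function name, action labels and toy task library are invented for the example and are not the Cornell system's actual code or data.

```python
# Illustrative sketch of anticipation from a task library. All names and
# the toy library below are made up; the real system learns from videos.

def anticipate(observed_actions, task_library):
    """Rank candidate next actions by how often they follow the
    observed sequence in the library, and return the most frequent."""
    scores = {}
    for task in task_library:  # each task is an ordered list of actions
        for i in range(len(task) - len(observed_actions)):
            if task[i:i + len(observed_actions)] == observed_actions:
                nxt = task[i + len(observed_actions)]
                scores[nxt] = scores.get(nxt, 0) + 1
    return max(scores, key=scores.get) if scores else None

# Toy library: three recorded "household tasks" as action sequences.
library = [
    ["pick_empty_bottle", "open_fridge", "grab_beer", "hand_over"],
    ["pick_empty_bottle", "throw_in_bin"],
    ["pick_empty_bottle", "open_fridge", "grab_beer", "hand_over"],
]

print(anticipate(["pick_empty_bottle"], library))  # -> open_fridge
```

With this toy data, spotting an empty bottle predicts "open the fridge" because two of the three recorded sequences continue that way, which mirrors the beer-fetching behaviour described above.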
A robot developed by researchers from Cornell University uses Kinect sensors, 3D cameras and a database of household task videos to anticipate its owner's needs. For example, it scans the surrounding area for clues and, when it spots an empty beer bottle, can open the fridge, pick up a full bottle of beer and hand it to its owner - without being told
The Cornell robot uses sensors and a 3D camera to analyse the depth of its surroundings (left). The view seen by the robot in the right-hand picture shows how it anticipates its owner's actions. It compares the actions against a database of household task videos and chooses what it thinks is the most appropriate response. The more actions the robot carries out, the more accurate its decisions become.

BEER DRONE WILL DELIVER DRINKS TO FESTIVAL GOERS FROM ABOVE


Festival-goers in South Africa this summer will be able to order beer from their smartphones and have it delivered by a flying drone dropping a can attached to a parachute.
The drone has been developed by Darkwing Aerials and will be tested at the Oppikoppi music festival in the Limpopo province of South Africa this August.
Customers will be able to place their drink orders through an iOS app that will send their GPS coordinates to the drone operators.  
As the actions continue, the robot can constantly update and refine its predictions. 
As well as fetching drinks for thirsty owners, the robot can also work out when its owner is hungry and put food in a microwave, tidy up, make cereal, fetch a toothbrush and toothpaste, open fridge doors and more.
 
Ashutosh Saxena, a professor of computer science at Cornell and co-author of a new study tied to the research, said: 'We extract the general principles of how people behave.
'Drinking coffee is a big activity, but there are several parts to it.
'The robot builds a 'vocabulary' of such small parts that it can put together in various ways to recognise a variety of big activities.'
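Saxena's "vocabulary of small parts" can be pictured as matching a stream of observed sub-actions against stored part sequences for each known activity. The sketch below is a hypothetical illustration only; the activity names and part sequences are assumptions made up for the example, not the study's actual vocabulary.

```python
# Illustrative "vocabulary of small parts": big activities are stored as
# sequences of small sub-actions, and recognition picks the best match.
# Activity names and part lists are invented for this example.

ACTIVITIES = {
    "drinking_coffee": ["reach", "grasp", "drink", "place"],
    "pouring_beer":    ["reach", "grasp", "pour", "place"],
}

def recognise(observed_parts):
    """Return the activity whose stored part sequence agrees with the
    observed sub-actions in the most positions."""
    def overlap(sequence):
        return sum(1 for a, b in zip(sequence, observed_parts) if a == b)
    return max(ACTIVITIES, key=lambda name: overlap(ACTIVITIES[name]))

print(recognise(["reach", "grasp", "pour"]))  # -> pouring_beer
```

The same small parts ("reach", "grasp") appear in several activities, so only a later distinguishing part ("pour" versus "drink") settles which big activity is underway - which is why, as the article notes, predictions get more accurate as more actions are observed.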
The Cornell robot can also help its owner tidy up. In this image, the robot scanned the area and noticed that its owner was carrying a pot of food and heading towards the fridge. The robot then automatically opened the fridge door. During tests, the robot made correct predictions 82% of the time when looking one second into the future, 71% correct for three seconds and 57% correct for 10 seconds
The robot was initially programmed to refill a person’s cup when it was nearly empty.
To do this the robot had to plan its movements in advance and then follow this plan.
But if a human sitting at the table happened to raise the cup and drink from it, the robot was thrown off and could end up pouring the drink into a cup that wasn't there.
After extra programming, the robot was updated so that when it sees the human reaching for the cup, it anticipates the action and avoids making a mistake.
During tests, the robot made correct predictions 82 per cent of the time when looking one second into the future, 71 per cent for three seconds and 57 per cent for 10 seconds.
This image shows the robot anticipating its owner walking towards the fridge and automatically opening the door for him. The first three images show the robot's view; the fourth is from the owner's viewpoint
'Even though humans are predictable, they are only predictable part of the time,' Saxena said.
'The future would be to figure out how the robot plans its action.
Right now we are almost hard-coding the responses, but there should be a way for the robot to learn how to respond.'
Saxena and Cornell graduate student Hema S. Koppula will present their research at the International Conference on Machine Learning in Atlanta in June.
They will also demonstrate the robot at the Robotics: Science and Systems conference in Berlin, Germany, also in June.


