Janne Lindqvist's recent work on smartphone interruptions has received a lot of attention in the popular media. The work, done with his PhD students Fengpeng Yuan (CS) and Xianyi Gao (ECE), will be presented at CHI’17, the premier publication venue for human-computer interaction.
The full paper is available here:
and a 30-second video preview here:
Prof. Lindqvist's work was highlighted on scientific websites such as those of the NSF, AAAS and ACM, and has also received worldwide coverage in widely circulated articles. Below are a few examples:
"Tired of annoying phone alerts? New system could act as a 'secretary' to predict when you want to be left alone", in the Daily Mail
"Pardon the interruption: Here's how your smartphone could be less of a noodge", in Network World
"Smartphone notifications driving you crazy? This might help", in the Economic Times
Congratulations to Janne on the wide attention for his work!
Smartphone Interruptions: Are Yours Relentless and Annoying? A Rutgers study, featured in Rutgers Today, reveals that personality traits influence and help predict receptiveness to smartphone notifications. “Ideally, a smartphone notification management system should be like an excellent human secretary who knows when you want to be interrupted or left alone,” said Janne Lindqvist, an assistant professor in the Department of Electrical and Computer Engineering in Rutgers’ School of Engineering. “We know that people struggle with time management all the time, so a smartphone, instead of being a nuisance, could actually help with things.” Read the complete article here.
Professor Vishal Patel's recent work on de-raining is featured in the publication The Outline. Along with his PhD students He Zhang and Vishwanath Sindagi, Professor Patel has recently developed an algorithm that removes rain from images using a deep learning method based on conditional generative adversarial networks. Please read the article entitled "Computers are learning how to see in the rain" at https://theoutline.com/post/979/scientists-remove-rain-and-snow-from-images-machine-learning.
Prof. Janne Lindqvist's research shows that how fast you drive might reveal exactly where you are going. Dr. Lindqvist's work on Elastic Pathing is featured in Rutgers Today
Prof. Lindqvist's Elastic Pathing research has been featured in many publications and news media, including YouTube and MIT Technology Review
Dr. Lindqvist's research is featured in the Communications of the ACM
and the front page of the IEEE Spectrum
Prof. Janne Lindqvist, an assistant professor of electrical and computer engineering, member of WINLAB and director of the Human-Computer Interaction Laboratory, led a team to show how just your driving speed can be used to track where you drive. This work, "Elastic Pathing: Your Speed is Enough to Track You", is part of an NSF-funded project for which Prof. Janne Lindqvist is the sole Principal Investigator.
Prof. Lindqvist's paper can be found at http://www.winlab.rutgers.edu/~janne/elasticpathing-ubicomp14.pdf and the YouTube video can be accessed by clicking below.
Prof. Janne Lindqvist's team included former PhD student Dr. Bernhard Firner, along with ECE PhD students Yulong Yang and Xianyi Gao, recently graduated Master's student Shridatt Sugrim, and undergraduate student Victor Kaiser-Pendergast.
The motivation for the project was that today people increasingly have the opportunity to opt-in to "usage-based" automotive insurance programs for reducing insurance premiums. In these programs, participants install devices in their vehicles that monitor their driving behavior, which raises some privacy concerns. Some devices collect fine-grained speed data to monitor driving habits.
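The privacy risk from fine-grained speed data can be illustrated with a simple sketch. This is illustrative only, not the paper's Elastic Pathing algorithm, and the speed trace and function names below are hypothetical: integrating one-second speed samples yields a distance profile, and the distances driven between stops (e.g. at intersections) form a signature that can be matched against road-network geometry to narrow down a route.

```python
# Illustrative sketch (NOT the Elastic Pathing algorithm itself): even
# without GPS, speed samples reveal how far a vehicle traveled and where
# it stopped, which constrains the possible routes it could have taken.

def distance_profile(speeds_mps, dt=1.0):
    """Cumulative distance (meters) from speed samples taken every dt seconds."""
    total, profile = 0.0, []
    for v in speeds_mps:
        total += v * dt          # simple rectangular integration of speed
        profile.append(total)
    return profile

def stop_segments(speeds_mps, dt=1.0, stop_thresh=0.5):
    """Split the trace at stops; return the distance of each driven segment."""
    segs, seg = [], 0.0
    for v in speeds_mps:
        if v < stop_thresh:      # vehicle effectively stopped (e.g. intersection)
            if seg > 0:
                segs.append(seg)
            seg = 0.0
        else:
            seg += v * dt
    if seg > 0:
        segs.append(seg)
    return segs

# Hypothetical one-second trace: drive 30 s at 10 m/s, stop 5 s, drive 20 s at 15 m/s
trace = [10.0] * 30 + [0.0] * 5 + [15.0] * 20
print(distance_profile(trace)[-1])   # 600.0 (total meters driven)
print(stop_segments(trace))          # [300.0, 300.0] -- two inter-stop segments
```

Matching such segment signatures against the lengths of road segments around a known starting point (such as the driver's home) is the kind of inference that makes "just speed" a privacy concern.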
Prof. Janne Lindqvist's interdisciplinary security work, which was presented at MobiSys'14, a tier-1 conference on mobile systems, has received publicity around the world.
In that work, Prof. Janne Lindqvist and his students, Michael Sherman (former ECE undergrad, and currently WINLAB staff), Gradeigh Clark (ECE PhD student), Yulong Yang (ECE PhD student) and Shridatt Sugrim (ECE MS and WINLAB staff), collaborated with Prof. Antti Oulasvirta of the Max Planck Institute for Informatics and Teemu Roos of the University of Helsinki on a novel form of authentication for mobile devices. In particular, they studied user-generated free-form gestures and developed a novel information-theoretic method for analyzing the security and memorability of the gestures. The group also built a working authentication system designed for the gestures.
On October 6th, Dr. Lindqvist's research was featured on "Wake Up Rutgers", a daily Rutgers TV program. The show is available online at the link below. Dr. Lindqvist's segment starts at 23 minutes and ends at 26 minutes.
The research paper is available at:
A press release, including a video, can be found at
The work has been featured on CBS Radio News and appeared in the following outlets:
Front page of NSF's 360 degrees website:
International Business Times, which reaches 5 million people in the UK and 50 million around the world
Front page of Scientific Computing
2nd largest daily newspaper in the UK
A lot of coverage in India, including major publishers:
Established in November 1981, it is the oldest and most widely circulated English-language broadsheet in Oman:
And lots more...
Prof. Janne Lindqvist's work on the Hazard Tracker app appeared on the Engineering for Change website.
Yulong Yang, a PhD candidate in Prof. Lindqvist's group, presented a paper on this work at the IEEE Global Humanitarian Technology Conference two weeks ago. The team also included Michael Sherman and, of course, the volunteers from Warren Township.
The app for reporting hazards was used by eight volunteers in Warren Township, New Jersey, who reported 349 potential electrical hazards to the local government in a period of just eight days. Armed with this information, the local utility company took action on 95 percent of the reports before the start of the 2014 hurricane season.
Prof. Lindqvist and his research team developed the “Hazard Tracker Application” to crowdsource hazard documentation. They found that using their app had three benefits: it was low cost, required little training and setup, and the data collected was highly portable and accessible to people who needed to work with it.
Prof. Janne Lindqvist's work on gesture-based passwords was mentioned on NPR (see
and also many other news and media websites:
To protect your financial and personal data, most mobiles come with PIN-based security, biometrics or number grids that require you to retrace a particular pattern to access your device. But is that good enough in crowded places full of prying eyes?
Not necessarily, according to a team of researchers from Rutgers University in New Jersey, Max Planck Institute for Informatics and Saarland University in Germany, and the University of Helsinki in Finland. Thieves snagged about 3.1 million smartphones in the U.S. alone last year, according to a Consumer Reports study released in May. Most of those phones are not likely to be protected by screen locks—only about one third of mobile phone users surveyed use a four-digit PIN. And even passcode-protected phones are vulnerable to “shoulder surfing” thieves who can glean PINs by observing their victims using their devices in a crowded location before striking, according to the researchers.
As an alternative to PINs and passcodes, the researchers are studying the feasibility of touchscreen drawings, which they call “gestures.” In such a scenario, users would set their “password” by using one or more fingers to draw a line, curve or some other pattern on their touchscreens. The device would assign a value to the gesture. Users would have to replicate that same gesture on the screen—coming reasonably close to the assigned value—to later unlock the device.
“Once the user has come up with a repeatable gesture, it is really hard for others to do [the gesture] accurately because of your unique characteristics of your hand, muscles and joints,” says Janne Lindqvist, one of the project’s leaders and an assistant professor in Rutgers’ School of Engineering’s Department of Electrical and Computer Engineering. A “recognizer” program then identifies such a gesture as unique to that user.
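The matching idea described above can be sketched in a few lines. This is a hedged illustration, not the team's actual recognizer: the resampling scheme, normalization, and distance threshold below are assumptions chosen for the example. A candidate gesture is resampled to a fixed number of points, normalized for position and size, and accepted only if it comes reasonably close to the enrolled template.

```python
import math

def normalize(points, n=32):
    """Resample a stroke to n points along its arc length, then translate and
    scale it to a unit box, so comparison ignores where/how large it was drawn."""
    # cumulative arc length at each input point
    d = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        d.append(d[-1] + math.hypot(x1 - x0, y1 - y0))
    total = d[-1] or 1.0
    # resample at n equally spaced arc-length positions (linear interpolation)
    res, j = [], 0
    for i in range(n):
        t = total * i / (n - 1)
        while j < len(d) - 2 and d[j + 1] < t:
            j += 1
        span = (d[j + 1] - d[j]) or 1.0
        a = (t - d[j]) / span
        x = points[j][0] + a * (points[j + 1][0] - points[j][0])
        y = points[j][1] + a * (points[j + 1][1] - points[j][1])
        res.append((x, y))
    # translate to the origin and scale to unit size
    xs, ys = [p[0] for p in res], [p[1] for p in res]
    cx, cy = min(xs), min(ys)
    s = max(max(xs) - cx, max(ys) - cy) or 1.0
    return [((x - cx) / s, (y - cy) / s) for x, y in res]

def matches(template, attempt, threshold=0.08):
    """True if the attempt is 'reasonably close' to the enrolled template."""
    a, b = normalize(template), normalize(attempt)
    err = sum(math.hypot(x0 - x1, y0 - y1)
              for (x0, y0), (x1, y1) in zip(a, b)) / len(a)
    return err < threshold

# Hypothetical strokes: a diagonal line, the same line redrawn slightly off,
# and an L-shaped impostor stroke
enrolled = [(0, 0), (50, 52), (100, 100)]
retry    = [(2, 1), (51, 50), (98, 101)]
impostor = [(0, 0), (100, 0), (100, 100)]
print(matches(enrolled, retry))     # similar shape -> True
print(matches(enrolled, impostor))  # different shape -> False
```

A production recognizer would also account for timing, pressure, and multi-finger strokes; the point here is only the enroll-then-compare-within-tolerance loop the article describes.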