Towards Characterizing Brain Communication Mechanisms Via Multi-Modal Imaging

Prof. Laleh Najafizadeh received a 4-year $323,134 grant from Siemens for the project "Towards Characterizing Brain Communication Mechanisms Via Multi-Modal Imaging".

The abstract is as follows:

Over the past two decades, studies of human brain function using functional brain imaging techniques have led to the understanding that human cognition arises from the contribution of large collections of brain networks that are continuously forming and dissolving at several spatial and temporal scales.

How, when, and where these networks form, and how their communication and connectivity can be changed over time (through stimulation, for example), are fundamental and challenging questions that, if addressed, will potentially open up avenues toward therapeutic development for cognitive improvement in several brain-related disorders. Functional brain imaging techniques, depending on the type of physical variable they measure, each present their own spatial and temporal resolution.

Electroencephalography (EEG) offers high temporal resolution (ms) but relatively modest spatial resolution (cm). Functional magnetic resonance imaging (fMRI), on the other hand, provides excellent spatial resolution (mm), but it measures the hemodynamic response, which is only an indirect measure of neuronal activity. Combining high-resolution EEG and fMRI offers the opportunity to study the neurophysiological events underlying brain activity at high temporal and spatial resolution. The objective of this project is to study the communication mechanisms of brain networks on two different scales by combining multi-modal information from EEG and fMRI.

Middleware and High Performance Analytics Libraries for Scalable Data Science

Prof. Shantenu Jha received an NSF grant of $1.25M for the project "Middleware and High Performance Analytics Libraries for Scalable Data Science". This is part of a $5M grant led by Geoffrey Fox, Indiana University, for which Shantenu is the middleware architect.

The NSF press release about the project can be found at:
http://www.nsf.gov/new/news_summ.jsp?cntn_id=132880

and the abstract is given below.

Many scientific problems depend on the ability to analyze and compute on large amounts of data. This analysis often does not scale well; its effectiveness is hampered by the increasing volume, variety and rate of change (velocity) of big data. This project will design, develop and implement building blocks that enable a fundamental improvement in the ability to support data intensive analysis on a broad range of cyberinfrastructure, including that supported by NSF for the scientific community. The project will integrate features of traditional high-performance computing, such as scientific libraries, communication and resource management middleware, with the rich set of capabilities found in the commercial Big Data ecosystem. The latter includes many important software systems such as Hadoop, available from the Apache open source community. A collaboration between university teams at Arizona, Emory, Indiana (lead), Kansas, Rutgers, Virginia Tech, and Utah provides the broad expertise needed to design and successfully execute the project. The project will engage scientists and educators with annual workshops and activities at discipline-specific meetings, both to gather requirements for and feedback on its software. It will involve under-represented communities through summer experiences, and will develop curriculum modules that include demonstrations built as 'Data Analytics as a Service.'

The project will design and implement a software Middleware for Data-Intensive Analytics and Science (MIDAS) that will enable scalable applications with the performance of HPC (High Performance Computing) and the rich functionality of the commodity Apache Big Data Stack. Further, this project will design and implement a set of cross-cutting high-performance data-analysis libraries: SPIDAL (Scalable Parallel Interoperable Data Analytics Library) will support new programming and execution models for data-intensive analysis in a wide range of science and engineering applications. The project addresses major data challenges in seven different communities: Biomolecular Simulations, Network and Computational Social Science, Epidemiology, Computer Vision, Spatial Geographical Information Systems, Remote Sensing for Polar Science, and Pathology Informatics. The project libraries will have the same beneficial impact on data analytics that scientific libraries such as PETSc, MPI, and ScaLAPACK have had for supercomputer simulations. These libraries will be implemented to be scalable and interoperable across a range of computing systems including clouds, clusters and supercomputers.

Distributed Just-Ahead-Of-Time Verification of Cyber-Physical Critical Infrastructures

Prof. Saman Zonouz received a 3-year $579,486 NSF grant for the project "Distributed Just-Ahead-Of-Time Verification of Cyber-Physical Critical Infrastructures". The project abstract is shown below.

Trustworthy operation of next-generation complex power grid critical infrastructures requires mathematical and practical verification solutions to guarantee correct infrastructural functionality. This project develops the foundations of theoretical modeling, synthesis, and real-world deployment of a formal and scalable controller code verifier for programmable logic controllers (PLCs) in cyber-physical settings. PLCs are widely used for control automation in industrial control systems. A PLC is typically connected to an engineering workstation where engineers develop the control logic to process the input values from sensors and issue control commands to actuators. The project focuses on protecting infrastructures against malicious control injection attacks on PLCs, such as Stuxnet, that inject malicious code onto the device to drive the underlying physical platform to an unsafe state. The broader impact of this project is highly significant: it offers the potential for real-time security for critical infrastructure systems covering sectors such as energy and manufacturing.

[Figure: project architecture]

The project's intellectual merit is in providing a mathematical and practical verification framework for cyber-physical systems through the integration of offline formal methods, online monitoring solutions, and power systems analysis. Offline formal methods do not scale to large platforms because of their exhaustive safety analysis of all possible system states, while online monitoring often reports findings too late for preventive action. This project takes a hybrid approach that dynamically predicts the possible next security incidents and reports them to operators before an unsafe state is encountered, allowing time for a response. The broader impact of this project is in providing practical mathematical analysis capabilities for general cyber-physical safety-critical infrastructure, with potential direct impact on our national security. The research outcomes are integrated into education modules for graduate, undergraduate, and K-12 classrooms.
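As a rough, hedged illustration of the just-ahead-of-time idea described above, the Python sketch below forward-simulates a toy controller model a few steps ahead of the live state and flags any reachable state that violates a safety predicate. The tank-level model, the open/close commands, the horizon, and the safety limit are all hypothetical simplifications for illustration; they are not the project's formal verifier.

# Hypothetical sketch of "just-ahead-of-time" checking: from the current PLC
# state, explore all states reachable within a small horizon and warn the
# operator before an unsafe state can actually be reached.
# The plant model and safety predicate below are toy assumptions, not the
# project's formal model.

def step(state, command):
    """Toy plant model: a tank level driven by PLC open/close commands."""
    level = state["level"] + (5 if command == "open_valve" else -3)
    return {"level": max(level, 0)}

def is_safe(state):
    return state["level"] <= 100  # toy safety predicate: no overflow

def just_ahead_of_time_check(state, commands=("open_valve", "close_valve"), horizon=3):
    """Return unsafe states reachable within `horizon` control steps."""
    frontier, unsafe = [state], []
    for _ in range(horizon):
        next_frontier = []
        for s in frontier:
            for cmd in commands:
                nxt = step(s, cmd)
                (unsafe if not is_safe(nxt) else next_frontier).append(nxt)
        frontier = next_frontier
    return unsafe

# If the tank is already near the limit, the check flags trouble before it occurs.
print(just_ahead_of_time_check({"level": 92}))  # -> [{'level': 102}]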

NeTS: Small: Transmit Only: Green Communication for Dense Wireless Systems

Prof. Zhang and Prof. Mandayam received a $498K NSF grant on realizing green communications through transmit-only networks.

Abstract: This project is targeted at realizing the vision of a "green" Internet of Things (IoT) and green communication using transmit-only devices. It is predicted that by 2020 there will be 50 billion embedded devices deployed in our ambient environment, most of which will report data to the cloud through wireless communication. It is therefore imperative to design green communication technologies in which power consumption is minimized and bandwidth utilization is optimized. Existing communication protocols, however, are not optimized for power consumption and bandwidth utilization because they were designed to facilitate reliable two-way exchange of information between communicating parties, whose requirements are completely different from those of IoT applications. The characteristics of emerging IoT applications, such as unidirectional communication flow, dense deployment, and small packet sizes, are all opportunities for significantly simplifying network design. This project proposes to simplify protocol design by removing receiving functions from the embedded devices, so that they spend radio resources only on sending application data, thus minimizing power consumption and maximizing bandwidth utilization.

In this project, we will study a set of algorithms that can achieve high throughput for wireless networks using transmit-only devices. The biggest challenge for a transmit-only network is handling packet collisions, as the transmitters have no means of knowing whether others are transmitting at the same time. Transmit-only operation can be thought of as a single-input-multiple-output multiple access (SIMO-MAC) channel, but there is a fundamental difference between transmit-only and previous work on the SIMO-MAC from the information theory community: in almost all of the previous studies, while transmitters do not communicate among themselves, they do rely on feedback from receivers to make transmission decisions related to encoding and/or scheduling. Transmit-only, on the other hand, assumes that once the network is deployed and in operation, each transmitter receives no feedback from other transmitters or receivers. In this case, to reduce packet collisions, and to further ensure that packet collisions do not lead to packet loss, this project proposes a set of strategies to proactively control the network topology as well as transmission schedules before network deployment. First, an optimal receiver placement strategy and a network-dynamics-based transmitter placement strategy are proposed to minimize packet loss during a collision, by exploiting the fact that the stronger signal can still be decoded at the receiver. Second, a transmission scheduling algorithm is proposed to overlap transmissions that can be decoded together (by different receivers) so as to minimize collisions. The proposed scheduling algorithm also takes transmitter mobility into account to minimize its negative impact on network throughput.
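As a rough illustration of the scheduling idea described above, the Python sketch below greedily assigns transmit-only nodes to time slots so that two transmitters share a slot only if they are decoded by different receivers (each at its strongest receiver, relying on the capture effect). The RSSI values, the decoding rule, and the greedy assignment are illustrative assumptions, not the project's algorithms.

# Hypothetical sketch: greedy slot assignment for transmit-only nodes.
# Two transmitters may share a slot only if they are decoded by different
# receivers (their strongest receiver differs); this is a simplification of
# the capture-effect and overlap ideas above, not the project's algorithm.

def strongest_receiver(rssi_map, tx):
    """Return the receiver that hears transmitter `tx` with the highest RSSI."""
    return max(rssi_map[tx], key=rssi_map[tx].get)

def schedule_slots(rssi_map):
    """Greedily place each transmitter in the first slot whose existing
    transmitters are all decoded at other receivers."""
    slots = []  # each slot is a list of (tx, receiver) pairs
    for tx in rssi_map:
        rx = strongest_receiver(rssi_map, tx)
        for slot in slots:
            if all(other_rx != rx for _, other_rx in slot):
                slot.append((tx, rx))
                break
        else:
            slots.append([(tx, rx)])
    return slots

# Toy example: RSSI (dB) from each transmitter to each receiver.
rssi = {
    "tx1": {"rxA": -40, "rxB": -70},
    "tx2": {"rxA": -72, "rxB": -45},
    "tx3": {"rxA": -42, "rxB": -80},
}
print(schedule_slots(rssi))  # tx1 and tx2 can share a slot; tx3 needs another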

Collaborative Research: S2I2: Conceptualization of a Center for Biomolecular Simulation

Prof. Shantenu Jha received an NSF award for a project entitled "Collaborative Research: S2I2: Conceptualization of a Center for Biomolecular Simulation". The award is for $131,123 for one year and is a collaboration with Stanford, Berkeley, Rice, and Rutgers.

This award will support the planning for a scientific software Institute in the area of Computational Chemistry. Molecular simulation is an integral part of contemporary chemistry due to its broad adoption by academic researchers and industries that use molecular mechanics and dynamics methodology to advance their science. The major molecular software programs have been downloaded by every major research university and by biotech and pharmaceutical companies, and their wide usage is well exemplified by the fact that they account for roughly 30% of the awarded cycles on the NSF XSEDE platforms.

However, the development, testing, and validation of biomolecular simulation software, and the support of high-throughput production runs on various hardware architectures, are capabilities that the user community wants and requires but that have not been adequately supported in a sustained way in the academic environment.

This award will examine and explore sustainable solutions to the development, deployment, and uptake of software used for Biomolecular Simulations, in accordance with NSF's SI2 Software Strategy. The blueprint for the Institute will highlight multiple inter-disciplinary research problems and agendas; collectively, these will contribute to the training of the next generation of computational scientists and application-oriented cyberinfrastructure experts.

End-User Behavior and Prospect Pricing in Wireless Data Networks

Prof. Narayan Mandayam has received an NSF grant for his project "End-User Behavior and Prospect Pricing in Wireless Data Networks". This is a $500K, 3-year grant, and Arnold Glass of Psychology is a co-PI. The abstract of the grant is shown below.

Abstract: There is a recognition and push in both industry and academia towards the goal of achieving "1000x" capacity for wireless. The solution approaches range from spectrally agile cognitive radios with novel spectrum sharing, to the use of higher-frequency spectrum, to smaller and denser cell deployments referred to as heterogeneous networks (HetNets). While this is a much-needed activity with many challenges to overcome, providing the spatially dense wireless/wired backhaul required for HetNets is expensive, and the overwhelming demands on wireless capacity fundamentally remain: state-of-the-art systems are nowhere near the 1000x capacity target, and are perhaps even an order of magnitude or two away. As a result, wireless service providers (SPs) have in recent times resorted to controlling the access and services provided to end-users via differentiated and hierarchical monetary pricing. A complementary approach, termed "prospect pricing", is proposed as a way to support data demand; it relies on influencing end-user (human) behavior using dynamic pricing algorithms when technological solutions by themselves cannot satisfy the demands of wireless data. When an SP controls access via differentiated and hierarchical monetary pricing, the performance of the network is directly subject to end-user decision-making, which has been shown to deviate from expected utility theory (EUT). Prospect Theory, a Nobel-prize-winning theory that explains real-life decision-making and its deviations from EUT behavior, is used to design "prospect pricing" for wireless networks. Specifically, dynamic pricing algorithms for wireless data are designed to enable HetNets to manage the ever-increasing demand for data, especially when both spectrum and infrastructure resources are constrained. Using a mix of theory, algorithm development, and experimentation with human subjects, the research agenda is carried out by a team comprising a wireless networking/systems engineer and a cognitive psychologist.
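As a rough illustration of how prospect-theoretic valuation departs from expected utility, the Python sketch below evaluates a simple pricing lottery using the value and probability-weighting functions of cumulative prospect theory (Tversky and Kahneman, 1992). The lottery, the linear-utility baseline, and the parameter values (commonly cited estimates from the literature) are illustrative assumptions, not elements of the project.

# Cumulative prospect theory functions (Tversky & Kahneman, 1992).
# The parameter values are the commonly cited estimates from that paper;
# they are illustrative here, not values used by the project.
ALPHA, BETA, LAMBDA, GAMMA = 0.88, 0.88, 2.25, 0.61

def value(x):
    """S-shaped value function: diminishing sensitivity for gains, loss aversion for losses."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** BETA)

def weight(p, gamma=GAMMA):
    """Inverse-S probability weighting: overweights small probabilities."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

# Hypothetical end-user choice framed as a lottery: a $2 discount (gain) with
# probability 0.9 versus a $5 overage charge (loss) with probability 0.1.
outcomes = [(2.0, 0.9), (-5.0, 0.1)]

eut = sum(p * x for x, p in outcomes)                 # expected value (linear utility, EUT baseline)
pt = sum(weight(p) * value(x) for x, p in outcomes)   # prospect-theoretic valuation
print(f"EUT value: {eut:.2f}, PT value: {pt:.2f}")    # PT turns negative: loss aversion dominates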

A Multi-Layer Approach Towards Reliable Cognitive Radio Networks

Prof. Wade Trappe and Prof. Yanyong Zhang received a three-year NSF grant for the project "Collaborative Research: A Multi-Layer Approach Towards Reliable Cognitive Radio Networks". It is a collaborative grant with Virginia Tech, and the Rutgers portion is $285K.

Abstract:

The development of radio technologies that support efficient and reliable spectrum sharing is an enabler for utilizing the spectrum being made available through the National Broadband Plan. Software-defined radios represent a promising technology that supports spectrum sharing, as evidenced by the large number of algorithms and protocols that allow cognitive radio networks (CRNs) to be deployed. Unfortunately, the economic promise of dynamic spectrum access is easily undermined if cognitive radio users act dishonestly or maliciously, thereby subverting protocols that are founded on the cooperation of users. It is therefore important to develop mechanisms that ensure the trustworthy operation of CRNs in the presence of potentially malicious or malfunctioning wireless nodes. The objective behind the project's research activities is to develop technological solutions that ensure that cognitive radios operate in a trustworthy manner in spite of potential security threats. As a result of this research effort, radio spectrum can be more reliably utilized, ensuring that the economic opportunities associated with the radio spectrum are fairly shared by everyone. The educational impact of the work comes from its multi-disciplinary foundation, broadening student views of wireless system design and guiding the next generation of wireless engineers to include security and reliability in the design process.

Wireless technologies are an enabler for economic growth in the United States, and cognitive radio networks are an emerging form of wireless system that makes spectrum access more available to the broader population. Unfortunately, cognitive radio systems are susceptible to threats that undermine the correct operation of their algorithms and protocols, and thus solutions that support the secure operation of cognitive radio networks are needed. This project ensures the trustworthy operation of cognitive radio networks by: 1) developing algorithms that ensure the correct operation of the spectrum sensing procedures upon which spectrum access protocols rely; 2) developing traffic monitoring tools that identify improper communication activity by cognitive radio devices; and 3) developing new forms of interference-resistant communications that allow cognitive radio communication to continue reliably in the face of interference. The research effort is inter-disciplinary, drawing on statistical tools, network traffic analysis, and communications theory to support the secure operation of cognitive radio networks. The algorithms and protocols developed in this project are complemented by a systems prototyping and experimentation effort aimed at guaranteeing that the technologies developed are suitable for deployment in real wireless systems.
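As one hedged illustration of the first thrust (trustworthy spectrum sensing), the Python sketch below fuses energy-detection reports from multiple cognitive radios while discarding reports that deviate sharply from the median, so that a few falsified measurements cannot flip the channel-occupancy decision. The thresholds, the outlier rule, and the report values are illustrative assumptions, not the project's algorithms.

import statistics

# Hypothetical sketch: robust fusion of cooperative spectrum-sensing reports.
# Reports far from the median are treated as suspect (possibly malicious or
# malfunctioning nodes) and excluded before deciding channel occupancy.
# The thresholds and outlier rule are illustrative assumptions.

def robust_fusion(energy_reports_dbm, occupancy_threshold_dbm=-85.0, max_dev_db=10.0):
    median = statistics.median(energy_reports_dbm)
    trusted = [r for r in energy_reports_dbm if abs(r - median) <= max_dev_db]
    channel_busy = statistics.mean(trusted) > occupancy_threshold_dbm
    return channel_busy, trusted

# Four honest nodes observe a quiet channel; one falsified report claims heavy use.
reports = [-95.0, -93.0, -96.0, -94.0, -40.0]
busy, trusted = robust_fusion(reports)
print(busy)     # False: the outlier at -40 dBm is discarded
print(trusted)  # the four consistent reports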

MatCam: A Camera that Sees Materials

Prof. Kristin Dana has been awarded a 3-year NSF grant for the project "MatCam: A Camera that Sees Materials". Rutgers is the lead institution on this $500K collaborative grant, with K. Dana as the Rutgers PI. Drexel University is the partner institution, with Ko Nishino as PI.

The proposed research program will create the first material camera, or MatCam, that outputs a per-pixel label of object material and its properties that can be used in any visual computing task. In the everyday real world there are a vast number of materials that are useful to discern, including concrete, metal, plastic, velvet, satin, a water layer on asphalt, carpet, tile, skin, hair, wood, and marble. A camera device for identifying these materials has important implications for developing new algorithms and new technologies for a broad set of application domains, including robotics, digital architecture, human-computer interaction, intelligent vehicles, and advanced manufacturing.

Abstract:

This project develops the first material camera, or MatCam, that outputs a per-pixel label of object material and its properties that can be used in visual computing tasks. In the everyday real world there are a vast number of materials that are useful to discern, including concrete, metal, plastic, velvet, satin, a water layer on asphalt, carpet, tile, wood, and marble. A device for identifying materials has important implications for developing new technologies. For example, a mobile robot may use a MatCam to determine whether the terrain is grass, gravel, pavement, or snow in order to optimize mechanical control. In e-commerce, the material composition of objects can be tagged by a MatCam for advertising and inventory. The potential applications are limitless in areas such as robotics, digital architecture, human-computer interaction, intelligent vehicles, and advanced manufacturing. Furthermore, material maps have foundational importance in nearly all vision algorithms, including segmentation, feature matching, scene recognition, image-based rendering, context-based search, object recognition, and motion estimation. The camera brings material recognition to the broader scientific and engineering communities, much as depth cameras are currently used in many fields outside of computer vision.

This research brings high-accuracy material estimation out of the lab and into the real world, enabling fast per-pixel material estimates. The program has three technical aims. First, a material appearance database is captured and stored using an exploration robot that views surfaces from multiple angles. This large, structured, and actionable visual dataset is then used to develop computational appearance models. A novel methodology using angular reflectance gradients is integrated for characterizing features of surface appearance. Using the training data and statistical inference methods, these models are designed for hardware implementation. The final aim is a near real-time material camera prototype for point-and-shoot material acquisition, extending RGB-D cameras to RGB-DM cameras that provide color, depth, and material. The hardware implementation of the material appearance models utilizes FPGA and SoC (system-on-chip) technology.
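As a rough sketch of what a per-pixel material map might look like in software, the example below trains a generic classifier on simple per-pixel color-plus-depth features and predicts a material label for every pixel, yielding an RGB-DM-style output. The feature choice, the label set, the random training data, and the use of scikit-learn are illustrative assumptions; the project's models are instead built on angular reflectance gradients with FPGA/SoC implementations.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical sketch: per-pixel material labeling from RGB-D input,
# producing the "M" channel of an RGB-DM image. The random per-pixel
# features and label set are placeholders for illustration only.

MATERIALS = ["concrete", "metal", "carpet", "wood"]

# Stand-in training data: one feature row per pixel (R, G, B, depth).
rng = np.random.default_rng(0)
X_train = rng.random((1000, 4))
y_train = rng.integers(len(MATERIALS), size=1000)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

def material_map(rgbd_image):
    """Label every pixel of an (H, W, 4) RGB-D image with a material index."""
    h, w, c = rgbd_image.shape
    labels = clf.predict(rgbd_image.reshape(-1, c))
    return labels.reshape(h, w)  # the per-pixel "M" channel

frame = rng.random((120, 160, 4))       # stand-in for a camera frame
m_channel = material_map(frame)
print(MATERIALS[m_channel[0, 0]])       # material label at pixel (0, 0)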
