Mobile WellBeing

mobile digital devices in service of human wellbeing

mHealth and thought control.

Posted by Ron Otten on 12/10/2009

First came the joystick. Then came the motion-sensing Wii remote. What's next? Sensors and mobiles are opening up a new world: thought control.

Co-founded by Allan Snyder, a neuroscientist and former University of Cambridge research fellow, Emotiv says its EPOC headset features 16 sensors that push against the player’s scalp to measure electrical activity in the brain – a process known as electro-encephalography. In theory, this allows the player to spin, push, pull, and lift objects on a computer monitor, simply by thinking. “There will be a convergence of gesture-based technology and the brain as a new interface – the Holy Grail is the mind,” says Snyder.

Last month the Defense Advanced Research Projects Agency (DARPA), an arm of the US Defense Department, said it had awarded a $6.7 million contract to Northrop Grumman to develop “brainwave binoculars”. The binoculars use scalp-mounted sensors to detect objects the user might have seen but not noticed – in other words, the computer is used as a kind of brain-aid, giving the user superhuman vision.

Explaining the technology, Dr Robert Shin, an assistant professor of neurology and ophthalmology at the University of Maryland School of Medicine, said: “There is a level where the brain can identify things before it ever makes it to the conscious level. Your brain says, ‘it may be something’, but it might not realize that it is something that should rise to the conscious level.”

Another defence contractor, Honeywell, has been working on a similar technology known as “augmented cognition” to help intelligence analysts to operate more effectively. Based on the same principle as the binoculars, it has been shown to make analysts work up to seven times faster. It can also detect when they are getting tired. In other tests, soldiers have been kitted out with headsets that detect “brain overload”, allowing commanders to know if they can process new information under the extreme pressures of the battlefield.

Posted in controlling, data, sharing

mHealth and motion capture.

Posted by Ron Otten on 07/10/2009

Motion capture, or mocap, is a technique for digitally recording movement. Are we playing games here? Originally used as an analysis tool for biomechanics, mocap is now successfully employed in a wide variety of sectors, including mHealth-related applications.

Movement is captured through the placement of sensors (or markers) on or near each joint of the body. As each joint moves, the positions or angles between the markers are recorded. Software records the angles, velocities, accelerations and impulses, providing an accurate digital representation of the movement.
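
To make the recording step concrete, here is a rough sketch (plain C++, with made-up sample data; the 120 Hz rate, marker coordinates and struct names are illustrative, not taken from any particular mocap product) of how position samples from one marker can be turned into velocities and accelerations by simple finite differences:

    #include <cstdio>
    #include <vector>

    // One 3D position sample of a single marker, in metres.
    struct Vec3 { double x, y, z; };

    int main() {
        const double dt = 1.0 / 120.0;   // assumed 120 Hz capture rate
        // Invented trajectory of one marker (e.g. a wrist marker moving up and forward).
        std::vector<Vec3> pos = {
            {0.00, 0.00, 1.00}, {0.01, 0.00, 1.02},
            {0.03, 0.00, 1.05}, {0.06, 0.00, 1.09},
        };

        // Central differences for velocity, second differences for acceleration.
        for (std::size_t i = 1; i + 1 < pos.size(); ++i) {
            Vec3 v = { (pos[i + 1].x - pos[i - 1].x) / (2 * dt),
                       (pos[i + 1].y - pos[i - 1].y) / (2 * dt),
                       (pos[i + 1].z - pos[i - 1].z) / (2 * dt) };
            Vec3 a = { (pos[i + 1].x - 2 * pos[i].x + pos[i - 1].x) / (dt * dt),
                       (pos[i + 1].y - 2 * pos[i].y + pos[i - 1].y) / (dt * dt),
                       (pos[i + 1].z - 2 * pos[i].z + pos[i - 1].z) / (dt * dt) };
            std::printf("frame %u: v = (%.2f, %.2f, %.2f) m/s, a = (%.2f, %.2f, %.2f) m/s^2\n",
                        (unsigned)i, v.x, v.y, v.z, a.x, a.y, a.z);
        }
        return 0;
    }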

Realtime data from mocap enables the diagnosis of problems or enhancement of performance in the arenas of biomechanics and sports. It can also assist in the design of products or buildings when applied to the field of engineering or ergonomics. Animazoo distinguishes four types of mocap system.

Gyroscopic systems use tiny inertial gyroscopes that are attached to the body. These directly record the rotations of the body parts. The rotational data is transmitted by radio to a receiver unit, where it is mapped instantly to a skeleton so that the data can be visualized in realtime. These systems run in realtime with no lag and produce highly accurate data that retains its nuance even during fast moves.

Mechanical systems track body joint angles directly and are often referred to as exo-skeleton mocap systems, due to the way the sensors are attached to the body. A person attaches the skeletal-like structure to their body and as they move so do the articulated mechanical parts, measuring the performer’s relative motion. Mechanical motion capture systems are realtime, relatively low-cost and usually wireless.

Optical systems triangulate the 3D position of a marker from two or more cameras that have been pre-calibrated for distance to provide overlapping projections. Tracking a large number of markers or multiple performers is accomplished by adding more cameras. These systems can be expensive to buy, require technical expertise to operate and are studio-based. They have a relatively small capture area, can suffer from occlusion, and are complicated to set up.
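
As a rough illustration of the triangulation idea only (this is not any vendor's algorithm, and real systems also have to solve camera pose and lens calibration), the sketch below estimates a marker's 3D position as the midpoint of the closest points between two calibrated camera rays; the camera positions and ray directions are invented numbers:

    #include <cstdio>

    struct Vec3 { double x, y, z; };

    static Vec3 add(Vec3 a, Vec3 b)     { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3 sub(Vec3 a, Vec3 b)     { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3 scale(Vec3 a, double k) { return {a.x * k, a.y * k, a.z * k}; }
    static double dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Midpoint of the closest points between two rays p + s*u and q + t*v.
    // With perfect calibration the rays meet at the marker; with noise the
    // midpoint is a reasonable estimate of its 3D position.
    static Vec3 triangulate(Vec3 p, Vec3 u, Vec3 q, Vec3 v) {
        Vec3 w0 = sub(p, q);
        double a = dot(u, u), b = dot(u, v), c = dot(v, v);
        double d = dot(u, w0), e = dot(v, w0);
        double denom = a * c - b * b;          // ~0 would mean the rays are parallel
        double s = (b * e - c * d) / denom;
        double t = (a * e - b * d) / denom;
        Vec3 closest1 = add(p, scale(u, s));
        Vec3 closest2 = add(q, scale(v, t));
        return scale(add(closest1, closest2), 0.5);
    }

    int main() {
        // Two cameras two metres apart, both "seeing" a marker near (1, 1, 3).
        Vec3 cam1 = {0, 0, 0}, dir1 = {1, 1, 3};    // direction of the marker seen by camera 1
        Vec3 cam2 = {2, 0, 0}, dir2 = {-1, 1, 3};   // direction of the marker seen by camera 2
        Vec3 m = triangulate(cam1, dir1, cam2, dir2);
        std::printf("estimated marker position: (%.2f, %.2f, %.2f)\n", m.x, m.y, m.z);
        return 0;
    }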

Magnetic systems calculate position and orientation by measuring the relative magnetic flux of three orthogonal coils on both the transmitter and each receiver. Magnetic systems require only two-thirds the number of markers compared to optical systems. One drawback is that the markers are susceptible to magnetic and electrical interference from metal objects and electrical sources in the environment. This interference makes the systems highly susceptible to error; they also require extensive data cleaning and technical expertise to operate, have a limited area of use, and suffer from lag in realtime use.

Posted in controlling, data, sharing

Basic components for building mHealth devices.

Posted by Ron Otten on 28/09/2009

One step beyond the platform is adding other components. What do you create when your motto is "Computing stuff tied to the physical world"? A tiny, fairly well-featured kit with wireless capability: the JeeNode wireless communication platform.

It looks like a fun and cost-effective way to get into experimenting with RF communication. By combining an Arduino-compatible processor (ATmega328) with a low-cost HopeRF radio module, Jean-Claude Wippler, based in the town of Houten, The Netherlands, creates these building blocks and offers them for sale as a kit; or, since it is an open-source hardware design, you can just download the PCB layout and roll your own. You can think of lots of applications (a remote candle lighter, an interactive cat toy) that aren’t worth a full XBee-based solution, where it would be handy to have a development board like this that you could just drop in and use.
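
To give a feel for how little code such a node needs, here is a rough Arduino-style sketch of a JeeNode broadcasting an analog sensor reading every few seconds. It assumes the RF12 driver from the JeeLabs library; the header and function names below are recalled from that library and worth checking against the JeeLabs documentation, and the node ID, group, frequency band and pin are arbitrary example values:

    #include <JeeLib.h>                  // RF12 driver; the header name may differ by library version

    struct Payload { unsigned int reading; } payload;

    void setup() {
      // Node id 1 in net group 100 on the 868 MHz band -- all arbitrary example values.
      rf12_initialize(1, RF12_868MHZ, 100);
    }

    void loop() {
      payload.reading = analogRead(0);   // e.g. a temperature or light sensor on analog pin 0
      rf12_recvDone();                   // keep the driver's state machine running
      if (rf12_canSend()) {
        rf12_sendStart(0, &payload, sizeof payload);   // broadcast the reading
      }
      delay(5000);                       // one report every five seconds
    }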

Jee Labs also has a weblog with daily news about projects being worked on in the fascinating world of physical computing, wireless comms, sensors, lights, switches, motors, robots, WSNs, Arduinos, you name it.

Posted in acting

Building a sensor network for mHealth purposes.

Posted by Ron Otten on 25/09/2009

For a wireless sensor network you need a platform to start with. But what? Arduino is an open-source electronics prototyping platform based on flexible, easy-to-use hardware and software. It’s intended for artists, designers, hobbyists, and anyone interested in creating interactive objects or environments.

Arduino can sense the environment by receiving input from a variety of sensors and can affect its surroundings by controlling lights, motors, and other actuators. The microcontroller on the board is programmed using the Arduino programming language (based on Wiring) and the Arduino development environment (based on Processing). Arduino projects can be stand-alone or they can communicate with software running on a computer (e.g. Flash, Processing, MaxMSP).
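
For a flavour of what an Arduino sketch looks like, here is a minimal example (the pin numbers and threshold are arbitrary) that reads a sensor on an analog input, switches an LED, and reports the value over the serial port so software on a computer can pick it up:

    const int sensorPin = 0;     // analog input, e.g. a light or temperature sensor
    const int ledPin    = 13;    // digital output driving an LED
    const int threshold = 512;   // arbitrary cut-off on the 0..1023 ADC range

    void setup() {
      pinMode(ledPin, OUTPUT);
      Serial.begin(9600);        // also report readings to a computer
    }

    void loop() {
      int value = analogRead(sensorPin);
      digitalWrite(ledPin, value > threshold ? HIGH : LOW);
      Serial.println(value);     // software on the PC (Processing, Flash, ...) can read this
      delay(100);
    }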

The boards can be built by hand or purchased preassembled. The software can be downloaded for free. The hardware reference designs (CAD files) are available under an open-source license, so you are free to adapt them to your needs.

Posted in acting

Wireless Sensor Networks and mHealth basics 3.

Posted by Ron Otten on 24/09/2009

A last round of Wireless Sensor Network theory coming up: what about the software, middleware and programming languages?

Software

Energy is the scarcest resource of WSN nodes, and it determines the lifetime of WSNs. WSNs are meant to be deployed in large numbers in various environments, including remote and hostile regions, with ad-hoc communication as key. For this reason, algorithms and protocols need to address the following issues (a back-of-the-envelope lifetime calculation follows the list):

  • Lifetime maximization
  • Robustness and fault tolerance
  • Self-configuration
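
Taking the first issue: a node's lifetime is roughly its battery capacity divided by its average current draw, and the average is dominated by how often the radio and CPU are awake. The calculation below uses invented but plausible figures for a battery-powered node:

    #include <cstdio>

    int main() {
        // Illustrative figures only -- real values depend on the node and radio used.
        const double battery_mAh = 2000.0;   // roughly two AA cells
        const double active_mA   = 20.0;     // CPU and radio on
        const double sleep_mA    = 0.01;     // deep sleep
        const double duty_cycle  = 0.01;     // awake 1% of the time

        double average_mA = duty_cycle * active_mA + (1.0 - duty_cycle) * sleep_mA;
        double hours      = battery_mAh / average_mA;

        std::printf("average draw: %.3f mA, lifetime: %.0f hours (about %.0f days)\n",
                    average_mA, hours, hours / 24.0);
        return 0;
    }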

Middleware

There is considerable research effort currently invested in the design of middleware for WSNs. In general, approaches can be classified as distributed database, mobile agent, or event-based.

Programming languages

Programming the sensor nodes is difficult when compared with normal computer systems. The resource-constrained nature of these nodes gives rise to new programming models, although most nodes are currently programmed in C.

Posted in acting, data, sharing

Wireless Sensor Networks and mHealth basics 2.

Posted by Ron Otten on 23/09/2009

What standards, hardware and operating systems are used for wireless sensor networks? There are three main standards. I wrote some articles about ZigBee; it is a proprietary mesh-networking specification intended for uses such as embedded sensing, medical data collection and home automation. WirelessHART is specifically designed for industrial applications. 6LoWPAN is the IETF standards-track specification. Also relevant to sensor networks is the emerging IEEE 1451, which attempts to create standards for the smart sensor market. The main point of smart sensors is to move the processing intelligence closer to the sensing device.

Hardware

The main challenge is to produce low-cost and tiny sensor nodes. With respect to these objectives, current sensor nodes are mainly prototypes; miniaturization and lower cost are expected to follow from recent and future progress. Many of the existing sensor nodes are still in the research stage. Also essential to sensor network adoption is the availability of a very low-power method for acquiring sensor data wirelessly.

Operating systems

Operating systems for wireless sensor network nodes are typically less complex than general-purpose operating systems, both because of the special requirements of sensor network applications and because of the resource constraints of sensor network hardware platforms. Wireless sensor network hardware is not different from traditional embedded systems, and it is therefore possible to use embedded operating systems such as eCos or uC/OS for sensor networks. However, such operating systems are often designed with real-time properties, whereas operating systems specifically targeting sensor networks often do not have real-time support.

TinyOS is perhaps the first operating system specifically designed for wireless sensor networks. Unlike most other operating systems, TinyOS is based on an event-driven programming model instead of multithreading. TinyOS programs are composed of event handlers and tasks with run-to-completion semantics. When an external event occurs, such as an incoming data packet or a sensor reading, TinyOS calls the appropriate event handler to handle the event. Event handlers can post tasks that are scheduled by the TinyOS kernel some time later. Both the TinyOS system and programs written for TinyOS are written in a special programming language called nesC, which is an extension of the C programming language.
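
nesC itself is beyond the scope of this post, but the event handler/task split is easy to mimic: an event handler does the bare minimum and posts a task, and a tiny scheduler later runs each posted task to completion. The sketch below is plain C++ written only as an analogy to the TinyOS model, not nesC code:

    #include <cstdio>
    #include <queue>
    #include <functional>

    // A minimal run-to-completion scheduler in the spirit of TinyOS tasks.
    static std::queue<std::function<void()>> taskQueue;

    static void postTask(std::function<void()> task) {
        taskQueue.push(task);                // tasks are queued, never preempted
    }

    // "Event handler": called when a sensor reading arrives; it only posts work.
    static void onSensorReading(int value) {
        postTask([value] {
            std::printf("processing reading %d\n", value);   // the deferred work
        });
    }

    int main() {
        onSensorReading(42);                 // pretend two hardware events occurred
        onSensorReading(43);

        // The "kernel" loop: run each posted task to completion, in order.
        while (!taskQueue.empty()) {
            taskQueue.front()();
            taskQueue.pop();
        }
        return 0;
    }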

There are also operating systems that allow programming in C. Examples of such operating systems include Contiki, MANTIS, BTnut, SOS and Nano-RK. LiteOS is a newly developed OS for wireless sensor networks, which provides UNIX-like abstractions and support for the C programming language.

Posted in acting, data, sharing

Wireless Sensor Networks and mHealth basics 1.

Posted by Ron Otten on 22/09/2009

Building a Wireless Sensor Network is fine, but what are the unique characteristics of such a network:

  • Limited power they can harvest or store
  • Ability to withstand harsh environmental conditions
  • Ability to cope with node failures
  • Mobility of nodes
  • Dynamic network topology
  • Communication failures
  • Heterogeneity of nodes
  • Large scale of deployment
  • Unattended operation
  • Node capacity is scalable, limited only by the bandwidth of the gateway node.

Sensor nodes can be imagined as small computers, extremely basic in terms of their interfaces and their components. They usually consist of a processing unit with limited computational power and limited memory, sensors (including specific conditioning circuitry), a communication device (usually radio transceivers or alternatively optical), and a power source usually in the form of a battery. Other possible inclusions are energy harvesting modules, secondary ASICs, and possibly secondary communication devices (e.g. RS-232 or USB).

The base stations are one or more distinguished components of the WSN with much more computational, energy and communication resources. They act as a gateway between sensor nodes and the end user.

Posted in data, sharing

mHealth secures hygiene in hospitals.

Posted by Ron Otten on 21/09/2009

Experts say nearly 2 million hospital-acquired infections occur each year, resulting in about 5,000 deaths and more than 90,000 illnesses in the US. Research shows that simple hand washing by medical staff could cut the number of infections in half. But what if you’re rushing to the next patient? There is now a wireless, credit-card-sized sensor that can detect whether health care workers have properly washed their hands upon entering a patient’s room.

The Virginia Commonwealth University Medical Center was chosen as a study site because of its higher-than-average rate of hand hygiene compliance, nearly twice the national average. The sensor is worn like a name badge and is programmed to detect the presence of ethyl alcohol, the most common ingredient in hand cleansing solutions used in hospitals.
When a health care worker enters a patient’s room, a small, wall-mounted sensor sends a signal to the badge to check for the presence of alcohol. The worker places their hands near the badge to obtain a reading. Lights on the badge glow red if no alcohol is present, indicating the need to wash hands. A green light indicates alcohol is present.
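
Purely as an illustration (this is not BioVigil's actual implementation; the pins, alcohol threshold and timing are invented), the badge-side logic could look roughly like this Arduino-style sketch: when the wall unit requests a check, sample the alcohol sensor and light the green or red LED accordingly:

    const int alcoholPin  = 0;     // analog ethyl alcohol sensor
    const int requestPin  = 2;     // goes HIGH when the wall unit requests a check
    const int greenLedPin = 8;
    const int redLedPin   = 9;
    const int threshold   = 300;   // invented cut-off for "alcohol present"

    void setup() {
      pinMode(requestPin, INPUT);
      pinMode(greenLedPin, OUTPUT);
      pinMode(redLedPin, OUTPUT);
    }

    void loop() {
      if (digitalRead(requestPin) == HIGH) {       // wall sensor asked for a reading
        int level = analogRead(alcoholPin);
        bool washed = level > threshold;
        digitalWrite(greenLedPin, washed ? HIGH : LOW);
        digitalWrite(redLedPin,   washed ? LOW  : HIGH);
        delay(5000);                               // keep the result visible for a while
        digitalWrite(greenLedPin, LOW);
        digitalWrite(redLedPin, LOW);
      }
    }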

“Health care workers don’t deliberately avoid washing their hands; they get distracted or are so busy moving from one thing to the next they don’t remember to do it,” said Mike Edmond, M.D., chief hospital epidemiologist. “Until now, the only way we’ve been able to track hand washing habits is through direct observation. This new system continuously monitors and records data and serves as a constant reminder.”

The hand hygiene program is part of an aggressive environmental and patient safety campaign at the VCU Medical Center called Safety First, Every Day. The goal of the campaign is to make the medical center the safest health care institution in the country with no events of preventable harm to patients, employees and visitors. The device was developed by BioVigil, LLC.

Posted in controlling, data, sharing

mHealth makes muscle tests more accurate.

Posted by Ron Otten on 13/07/2009

Doctors test the strength of intrinsic hand muscles by letting the patient pull and push at their hand and fingers. Is this an accurate method? No, which is why a team of bioengineering students from Rice University developed a device to measure the thenar, hypothenar, interosseous and lumbrical muscles.

Graduates Caterina Kaffes, Matthew Miller, Neel Shah and Shuai “Steve” Xu invented PRIME, or Peg Restrained Intrinsic Muscle Evaluator, for their senior project. “Twenty percent of all ER admissions are hand-related. Neuromuscular disorders like spinal cord injuries, Lou Gehrig’s, diabetes, multiple sclerosis – all these diseases affect the intrinsic hand muscles,” said Xu. PRIME was created to replace the common test. The real goal is to quantify finger/muscle strength for more accurate diagnosis in carpal tunnel syndrome evaluation and other disorders.

“U.S. surgeons perform over 500,000 procedures for carpal tunnel each year. $2 billion per year is spent treating this disease but up to 20 percent of all surgeries need to be redone. Our invention can be used across the spectrum of care from diagnosis to outcome measurements,” said Xu.

The device has three elements: a pegboard restraint, a force transducer enclosure and a PDA custom-programmed to capture measurements. In a five-minute test, a doctor uses pegs to isolate a patient’s individual fingers. “You wouldn’t think it works as well as it does, but once you are pegged in, you can’t move anything but the finger we want you to,” Miller said. A loop is fitted around the finger, and when the patient moves it, the amount of force generated is measured. “PRIME gets the peak force,” Xu said. “Then the doctor can create a patient-specific file with all your information, time-stamped, and record every single measurement.”
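
The core measurement is simple: sample the force transducer during the test, keep the peak, and time-stamp it for the patient's file. The sketch below (plain C++ with simulated samples and an invented calibration factor, not the team's actual code) shows that idea:

    #include <cstdio>
    #include <ctime>
    #include <vector>

    int main() {
        // Simulated raw ADC samples from the force transducer during one finger test.
        std::vector<int> samples = {12, 40, 95, 180, 310, 290, 150, 60, 15};
        const double newtonsPerCount = 0.05;   // invented calibration factor

        int peak = 0;
        for (int s : samples)
            if (s > peak) peak = s;            // keep the peak force reading

        std::time_t now = std::time(nullptr);  // time-stamp for the measurement record
        std::printf("peak force: %.2f N, recorded at %s",
                    peak * newtonsPerCount, std::ctime(&now));
        return 0;
    }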

Posted in controlling, data, sharing

U.S. rules for MBANs validate the potential of mHealth.

Posted by Ron Otten on 10/07/2009

The Federal Communications Commission (FCC) has proposed to allocate radiofrequency spectrum and establish service and technical rules for the operation of Medical Body Area Network (MBAN) systems. Why is the FCC interested in this area? They envision that MBANs would provide a platform for the wireless networking of multiple body sensors used for monitoring a patient’s physiological data, primarily in health care facilities.

MBAN’s could be used to monitor an array of physiological data, such as temperature, pulse, blood glucose level, blood pressure, respiratory function and a variety of other physiological metrics. MBAN systems would primarily be used in health care facilities, with the potential also of being used in other patient care/monitoring circumstances. Unlike traditional medical telemetry systems which rely on separate uncoordinated links for each physiological function being monitored, MBAN systems could serve to wirelessly monitor all of the desired data of a single patient, which could then be aggregated and wirelessly transmitted to a remote location for evaluation.

Using MBAN systems to eliminate much of the cabling that typically connects patients to monitoring equipment, and to facilitate the aggregation and transfer of physiological data, will offer several clinical benefits, including improved patient mobility and comfort, reduced risk of infection, reduced clinical errors, and reduced patient monitoring costs.

The Notice of the FCC seeks comment on options for accommodating MBAN operations in several frequency bands, and on the amount of spectrum that should be allocated for such use. More specifically, the Notice seeks comment on the feasibility of using the 2360-2400 MHz; 2300-2305 MHz and 2395-2400 MHz; the 2400-2483.5 MHz; or 5150-5250 MHz bands for this purpose, and on various licensing schemes that would be appropriate for any of these bands under consideration. In addition, the Notice seeks comment on tentative service and eligibility rules that would be similar in many respects to those for other wireless body-worn and implanted medical devices operating in the MedRadio Service in the 401-406 MHz bands.

This action by the Commission is by Notice of Proposed Rule Making (FCC 09-57).

Posted in communicating, data, sharing