Tag Archives: Featured

Tony Chahine on human presence, reimagined | ApplySci @ Stanford


Myant's Tony Chahine reimagined human presence at ApplySci's recent Wearable Tech + Digital Health + Neurotech conference at Stanford:


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab. Speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – Juan Enriquez – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda – Marom Bikson

REGISTRATION RATES INCREASE JUNE 29TH

Proof of concept 3D printed cornea


Newcastle University's Che Connon has demonstrated a proof of concept that could lead to 3D printed corneas.

Stem cells from a healthy donor cornea were mixed with alginate and collagen to create a printable bio-ink. A 3D printer extruded the bio-ink in concentric circles to form the shape of a human cornea in less than 10 minutes. The stem cells then continued to grow within the printed structure.

According to Connon: “Our unique gel – a combination of alginate and collagen – keeps the stem cells alive whilst producing a material which is stiff enough to hold its shape but soft enough to be squeezed out the nozzle of a 3D printer. This builds upon our previous work in which we kept cells alive for weeks at room temperature within a similar hydrogel. Now we have a ready to use bio-ink containing stem cells allowing users to start printing tissues without having to worry about growing the cells separately.”

The team demonstrated that they could build a cornea to match a patient’s unique specifications, but said that it will be several years before this might be used for transplants.

Click to view Newcastle University video



Thought, gesture-controlled robots


MIT CSAIL’s Daniela Rus has developed an EEG/EMG robot control system based on brain signals and finger gestures.

Building on the team's previous brain-controlled robot work, the new system detects, in real time, whether a person notices a robot's error. Muscle activity measurement then enables the use of hand gestures to select the correct option.

According to Rus: “This work, combining EEG and EMG feedback, enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback. By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”

The researchers used a humanoid robot from Rethink Robotics, while the human supervisor wore electrodes on his or her head and arm.

Human supervision increased correct target selection from 70 to 97 percent.
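As a toy illustration of how such supervision lifts accuracy: if an EEG error signal (imperfectly) flags the robot's mistakes and an EMG gesture then supplies the correct target, a 70-percent robot can reach roughly 97 percent. This is a minimal simulation, not CSAIL's actual pipeline; the 90-percent error-detection sensitivity and all other numbers are invented toy values chosen only to reproduce the reported figures.

```python
import random

def robot_choose(correct, accuracy=0.70):
    """Robot picks the correct of two targets 70% of the time on its own."""
    return correct if random.random() < accuracy else 1 - correct

def eeg_detects_error(robot_pick, correct, sensitivity=0.90):
    """Error-related potential fires (imperfectly) when the robot errs."""
    return robot_pick != correct and random.random() < sensitivity

def emg_gesture_target(correct):
    """Hand gesture selects the intended target directly."""
    return correct

def supervised_trial(correct):
    pick = robot_choose(correct)
    if eeg_detects_error(pick, correct):
        pick = emg_gesture_target(correct)  # human corrects the robot
    return pick

random.seed(0)
trials = 10_000
targets = [random.randint(0, 1) for _ in range(trials)]
hits = sum(supervised_trial(c) == c for c in targets)
print(f"supervised accuracy ~ {hits / trials:.2f}")
```

With these toy numbers the expected accuracy is 0.70 + 0.30 × 0.90 = 0.97, matching the 70-to-97-percent improvement described above.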

The goal is a system that can be used by people with limited mobility or language disorders.

Click to view CSAIL video



Phillip Alvelda: More intelligent; less artificial | ApplySci @ Stanford


Phillip Alvelda discussed AI and the brain at ApplySci’s recent Wearable Tech + Digital Health + Neurotech Silicon Valley conference at Stanford:


Dr. Alvelda will join us again at Wearable Tech + Digital Health + Neurotech Boston, on September 24, 2018 at the MIT Media Lab. Other speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – Juan Enriquez – John Mattison – Roozbeh Ghaffari – Poppy Crum – Marom Bikson


Algorithm predicts low blood pressure during surgery


UCLA's Maxime Cannesson has developed an algorithm that, in a recent study, predicted an intraoperative hypotensive event 15 minutes before it occurred in 84 percent of cases, 10 minutes before in 84 percent of cases, and five minutes before in 87 percent of cases.

The goal is early identification and treatment, to prevent complications, such as postoperative heart attack, acute kidney injury, or death.

The algorithm is based on recordings of the increase and decrease of blood pressure in the arteries during a heartbeat—including episodes of hypotension. For each heartbeat, the researchers were able to derive 3,022 individual features from the arterial pressure waveforms, producing more than 2.6 million bits of information. They then identified which of the features—when they happen together and at the same time—predict hypotension.
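To make the idea of beat-level waveform features concrete, here is a hedged sketch of extracting a few classic features from one heartbeat of an arterial pressure trace. The study derived 3,022 features per beat; the handful below (systolic, diastolic, pulse pressure, mean pressure, maximum upstroke slope) are simplified, standard stand-ins, not the actual feature set, and the synthetic beat is invented for illustration.

```python
import math

def beat_features(p, fs):
    """Simple features from one heartbeat of arterial pressure samples.
    p  : list of pressure samples (mmHg) for a single beat
    fs : sampling rate in Hz
    """
    systolic = max(p)
    diastolic = min(p)
    # Steepest rise between consecutive samples, scaled to mmHg/s
    max_dpdt = max(b - a for a, b in zip(p, p[1:])) * fs
    return {
        "systolic": systolic,
        "diastolic": diastolic,
        "pulse_pressure": systolic - diastolic,
        "mean_pressure": sum(p) / len(p),
        "max_dpdt": max_dpdt,
    }

# Synthetic one-second beat: 80 mmHg baseline with a 40 mmHg systolic bump
fs = 100
beat = [80 + 40 * math.sin(math.pi * i / fs) ** 2 for i in range(fs)]
print(beat_features(beat, fs))
```

A machine-learning model would then be trained on such per-beat feature vectors, labeled by whether hypotension followed, to learn which feature combinations are predictive.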

Cannesson said that the research “opens the door to the application of these techniques to many other physiological signals, such as EKG for cardiac arrhythmia prediction or EEG for brain function” and “could lead to a whole new field of investigation in clinical and physiological sciences and reshape our understanding of human physiology.”



Nano-robots remove bacteria, toxins from blood


UCSD’s Joe Wang and Liangfang Zhang have developed tiny ultrasound-powered robots that can swim through blood, removing harmful bacteria and toxins.

Gold nanowires were coated with platelet and red blood cell membranes, allowing the nanorobots to perform the tasks of two different cells at once—platelets, which bind pathogens, and red blood cells, which absorb and neutralize toxins. The gold body responds to ultrasound, giving the nanorobots the ability to swim rapidly without chemical fuel, helping them mix with blood bacteria and toxins and speed detoxification.

The robots are about 25 times smaller than the width of a human hair and can travel 35 micrometers per second in blood. When tested on MRSA-contaminated blood samples, treated samples contained three times less bacteria and toxins after five minutes than untreated samples.

Click to view UCSD video



Bob Knight on decoding language from direct brain recordings | ApplySci @ Stanford


Berkeley’s Bob Knight discussed (and demonstrated) decoding language from direct brain recordings at ApplySci’s recent Wearable Tech + Digital Health + Neurotech Silicon Valley at Stanford:



Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22, 2019 at Stanford University.

“Artificial nerve” system for sensory prosthetics, robots


Stanford's Zhenan Bao has developed an artificial sensory nerve system that can activate the twitch reflex in a cockroach and identify letters in the Braille alphabet. Bao describes it as "a step toward making skin-like sensory neural networks for all sorts of applications," which would include artificial skin that creates a sense of touch in prosthetics.

The artificial nerve circuit integrates three components:

  • A touch sensor that can detect minuscule forces.
  • A flexible electronic neuron that receives signals from the touch sensor.
  • An artificial synaptic transistor, modeled after human synapses, that is stimulated by these sensory signals.

The system was successfully tested to generate both reflexes and a sense of touch. The team also hopes to create low-power artificial sensor nets to cover robots, providing feedback that makes them more agile.
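The three-stage chain described above can be sketched as a simple signal pipeline: the sensor converts force to a voltage, the artificial neuron encodes that voltage as a spike frequency, and the synaptic transistor integrates the arriving spikes. All parameters (gains, frequencies, window length) below are invented toy values for illustration, not measurements from the Stanford device.

```python
def sensor_to_voltage(force_kpa, gain=0.5):
    """Touch sensor: applied force changes an output voltage (toy linear model)."""
    return gain * force_kpa

def neuron_spike_rate(voltage, hz_per_volt=20.0):
    """Artificial neuron: voltage sets the oscillation (spike) frequency."""
    return hz_per_volt * voltage

def synapse_response(spike_rate_hz, window_s=0.1):
    """Synaptic transistor: integrates spikes arriving within a time window."""
    return spike_rate_hz * window_s  # expected spikes per window

for force in (0.0, 2.0, 10.0):  # light touch .. firm press, in kPa
    rate = neuron_spike_rate(sensor_to_voltage(force))
    print(f"{force:5.1f} kPa -> {rate:6.1f} Hz -> "
          f"{synapse_response(rate):.1f} spikes/window")
```

The key design idea this mirrors is rate coding: stronger touch produces faster oscillation, and the downstream synapse responds to spike frequency rather than raw voltage, as in biological nerves.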

Click to view Science video



Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22, 2019 at Stanford University. Zhenan Bao will be the keynote speaker.

Body heat-powered, self-repairing health sensor system


Hossam Haick at the Technion-Israel Institute of Technology has developed a body heat-powered, self-repairing system of sensors for disease detection and monitoring.

Unlike other wearables, the system derives energy from the wearer and repairs its own tears and scratches, eliminating the need to turn off the device for repair or charging and allowing truly continuous tracking.



Nathan Intrator on epilepsy, AI, and digital signal processing | ApplySci @ Stanford


Nathan Intrator discussed epilepsy, AI and digital signal processing at ApplySci’s Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 26-27, 2018 at Stanford University:

