Two new initiatives have extended the line of technology contributions to the health field. One comes from the Massachusetts Institute of Technology (MIT) and focuses on balancing care with privacy. The second, from the Stanford University School of Engineering, proposes more efficient ways to deal with the complexities of bedside treatment in hospitals.

Called RF-Diary, the MIT project notes that cameras, and even in-person observation, have long been obstacles to caring for elderly or sick people in domestic environments. The proposal is to create a textual description (a caption) of people's activities and interactions at home with the help of radio signals. With this, family members can receive updates on an elderly relative's daily life: whether they had lunch, for example, or maintained proper hygiene.

A commonly used alternative is deploying cameras. However, cameras often raise privacy concerns, particularly in bedrooms and bathrooms. They also have a limited field of view, so several must be deployed to cover different rooms, and they do not work well in low light, which is common at home and at night.

To overcome these limitations, the MIT researchers propose using radiofrequency (RF) signals, which they say provide several benefits. One is preserving privacy better than cameras, since the signals are difficult for humans to interpret. In addition, the signals can pass through walls and obstacles, covering most home environments, and they work equally well in dim or bright light, with no performance degradation. According to the researchers, it should be possible to analyze the radio signals that bounce off people's bodies to capture their movements. However, they warn that RF signals also introduce new challenges, such as the inability to differentiate some objects.
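To make the caption idea concrete, here is a toy sketch of the output side only: turning activity labels (which RF-Diary infers from radio reflections with a trained neural model, stubbed out here) into plain-text updates for family members. The function name and event labels are hypothetical illustrations, not part of the MIT system.

```python
from datetime import time

def daily_update(events):
    """Format a day's inferred activities as a plain-text update.

    events: list of (datetime.time, activity-description) tuples, assumed
    to have been produced upstream by an RF-based activity recognizer.
    """
    return "\n".join(f"{t.strftime('%H:%M')} - {activity}"
                     for t, activity in events)

print(daily_update([
    (time(12, 15), "had lunch in the kitchen"),
    (time(19, 40), "washed up in the bathroom"),
]))
```

The hard part of the real system is, of course, the upstream recognizer; this sketch only shows the kind of caption a relative might receive.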
Another issue is obtaining a data set that pairs RF signals from people's homes with the corresponding captions. Training a captioning system systematically may require tens of thousands of samples, and collecting such data from people's homes is itself a difficult task.

Better understanding of spaces

Aiming to help mitigate clinical errors and improve treatment in hospitals and at home, the Stanford University School of Engineering has developed a smart room design that combines advances in machine learning and electronic sensors to improve understanding of so-called unobserved spaces in health care. The project is led by Arnold Milstein, professor of medicine and director of the Stanford Center for Clinical Excellence Research, computer science professor Fei-Fei Li, and graduate student Albert Haque.

In an article in the journal Nature, they comment that technology has already helped doctors with diagnostic and therapeutic decisions, but that few technological solutions yet help doctors, nurses, patients, and family members in their treatment and health care routines. With this interdisciplinary effort, they intend to create smart hospital rooms equipped with artificial intelligence (AI) systems to improve treatment outcomes and mitigate clinical errors. For example, these smart spaces could help intensive care units and operating rooms run workflows more efficiently and safely: doctors, nurses, and assistants would be alerted by electronic sensors to sanitize their hands. Even in domestic environments, smart rooms could prolong elderly people's autonomy and facilitate the monitoring of bedridden people or those with chronic diseases, watching for evidence of atypical behavior.

This Stanford project rests mainly on two technological areas: infrared sensors, and machine learning to train applications used in health care.
Active infrared systems applied in this project use AI to calculate how long the rays take to return to the source and thus map the contours of people and objects. The second type, passive infrared technology, used for night vision based on body heat, may make it possible to detect contractions or contortions under sheets and alert the clinical team to impending problems.

So far, the project has avoided high-definition video, such as that captured by smartphones, since recording images could unnecessarily intrude on doctor and patient privacy. Beyond privacy, the Stanford researchers note that, as with other classes of technology, smart environments in hospitals will face challenges in other areas, such as rigorous clinical validation and model transparency. In the field of machine learning applied to medical devices alone, a report published by the PHG Foundation, a nonprofit health policy think tank affiliated with the University of Cambridge, describes three major challenges associated with digital health regulation in the United States and the United Kingdom.
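The active-infrared ranging described above boils down to a time-of-flight calculation: distance is the round-trip travel time of the emitted ray, multiplied by the speed of light and halved. A minimal sketch of that arithmetic (the function name and example timing are illustrative, not taken from the Stanford project):

```python
# Time-of-flight ranging: an emitted infrared pulse reflects off a surface,
# and the round-trip delay tells us how far away that surface is.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in meters (half the round trip)."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after 20 nanoseconds traveled to a surface ~3 m away.
print(round(distance_from_round_trip(20e-9), 2))  # prints 3.0
```

Repeating this measurement across a grid of emission angles yields a depth map from which contours can be extracted; the project's machine learning sits on top of raw ranges like these.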