Search results for: hardware impairment
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 960

120 35 MHz Coherent Plane Wave Compounding High Frequency Ultrasound Imaging

Authors: Chih-Chung Huang, Po-Hsun Peng

Abstract:

Ultrasound transient elastography has become a valuable tool for many clinical diagnoses, such as liver diseases and breast cancer. Pathological tissue can be distinguished by elastography because its stiffness differs from that of the surrounding normal tissue. An ultrafast imaging frame rate is needed for the transient elastography modality; however, elastograms obtained with such ultrafast systems suffer from low resolution, which affects the robustness of transient elastography. To overcome these problems, a coherent plane-wave compounding technique has been proposed for conventional ultrasound systems operating at around 3-15 MHz. The purpose of this study is to develop a novel beamforming technique for high-frequency ultrasound coherent plane-wave compounding imaging; the simulated results will provide the standards for hardware development. Plane-wave compounding fires all elements of an array transducer in one shot at different inclination angles, receives the echoes with conventional beamforming to produce a series of low-resolution images, and compounds these images coherently. Simulations of plane-wave compounding images and focused-transmit images were performed using Field II. All images were produced from point spread functions (PSFs) and cyst phantoms with a 64-element linear array working at 35 MHz center frequency, 55% bandwidth, and a pitch of 0.05 mm. The F-number was 1.55 in all simulations. PSFs and cyst phantoms were simulated using single-angle, 17-angle, and 43-angle plane-wave transmissions (successive plane waves separated by 0.75 degree) as well as focused transmission. Image resolution and contrast improved with the number of plane-wave firing angles. Lateral resolution was measured as the -10 dB lateral beam width for each method. Comparing the plane-wave compounding image with the focused-transmit image, both exhibited the same lateral resolution of 70 um when 37 angles were compounded. The lateral resolution reached 55 um when 47 angles were compounded. These results show the potential of high-frequency plane-wave compound imaging for characterizing the elastic properties of microstructured tissues, such as the eye, skin, and vessel walls, in the future.
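
As a rough illustration of the compounding step described above, the sketch below performs delay-and-sum beamforming for each plane-wave firing and sums the resulting low-resolution images coherently. It is not the authors' Field II setup; the sound speed, sampling rate, and pitch are assumed values for illustration only.

```python
import numpy as np

def compound_plane_waves(rf, angles, x, z, c=1540.0, fs=125e6, pitch=50e-6):
    """Coherent plane-wave compounding by delay-and-sum beamforming.

    rf     : (n_angles, n_elements, n_samples) array of received echoes
    angles : plane-wave steering angles in radians
    x, z   : lateral / axial pixel grids in metres
    Returns the coherent sum of one low-resolution image per firing angle.
    """
    n_angles, n_elem, n_samp = rf.shape
    elem_x = (np.arange(n_elem) - (n_elem - 1) / 2) * pitch
    xx, zz = np.meshgrid(x, z)
    image = np.zeros_like(xx)
    for a, theta in enumerate(angles):
        t_tx = (zz * np.cos(theta) + xx * np.sin(theta)) / c   # plane-wave transmit delay
        low_res = np.zeros_like(xx)
        for e in range(n_elem):
            t_rx = np.sqrt(zz**2 + (xx - elem_x[e]) ** 2) / c  # receive delay to element e
            idx = np.clip(np.round((t_tx + t_rx) * fs).astype(int), 0, n_samp - 1)
            low_res += rf[a, e][idx]                           # dynamic receive focusing
        image += low_res                                       # coherent (pre-envelope) sum
    return image
```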

Keywords: plane wave imaging, high frequency ultrasound, elastography, beamforming

Procedia PDF Downloads 518
119 Utilizing Literature Review and Shared Decision-Making to Support a Patient Make the Decision: A Case Study of Virtual Reality for Postoperative Pain

Authors: Pei-Ru Yang, Yu-Chen Lin, Jia-Min Wu

Abstract:

Background: A 58-year-old man with a history of osteoporosis and diabetes presented with chronic pain in his left knee due to severe knee joint degeneration. Knee replacement surgery was recommended by the doctor, but the patient had low pain tolerance and wondered whether virtual reality could relieve acute postoperative wound pain. Methods: We used the PICO (patient, intervention, comparison, and outcome) approach to generate indexed keywords and searched for systematic review articles published from 2017 to 2021 in the Cochrane Library, PubMed, and Clinical Key databases. Results: The initial literature search returned 38 articles, including 12 Cochrane Library articles and 26 PubMed articles. One article was selected for further analysis after removing duplicates and off-topic articles. The eight trials included in this article were published between 2013 and 2019 and recruited a total of 723 participants. The studies, conducted in India, Lebanon, Iran, South Korea, Spain, and China, included adults who underwent hemorrhoidectomy, dental surgery, craniotomy or spine surgery, episiotomy repair, and knee surgery, with mean ages ranging from 24.1 ± 4.1 to 73.3 ± 6.5 years. Virtual reality is an emerging non-drug method of postoperative analgesia. The findings showed that pain scores were reduced by a mean of 1.48 points (95% CI: -2.02 to -0.95, p-value < 0.0001) in minor surgery and 0.32 points in major surgery (95% CI: -0.53 to -0.11, p-value < 0.03), and overall postoperative satisfaction improved. Discussion: Postoperative pain is a common clinical problem in surgical patients. Research has confirmed that virtual reality can create an immersive interactive environment, communicate with patients, and effectively relieve postoperative pain. However, virtual reality requires the purchase of hardware, software, and other related computer equipment, and its high cost is a disadvantage. We selected the best literature based on the clinical question to answer the patient's question and used shared decision-making (SDM) to help the patient make decisions based on the clinical situation after knee replacement surgery, thereby improving the quality of patient-centered care.

Keywords: knee replacement surgery, postoperative pain, shared decision-making, virtual reality

Procedia PDF Downloads 54
118 Spatial Pattern of Environmental Noise Levels and Auditory Ailments in Abeokuta Metropolis, Southwestern Nigeria

Authors: Olusegun Oguntoke, Aramide Y. Tijani, Olayide R. Adetunji

Abstract:

Environmental noise has become a major threat to the quality of human life, and it is generally more severe in cities. This study assessed environmental noise levels, mapped their spatial pattern at different times of the day, and examined their association with the morbidity of auditory ailments in the Abeokuta metropolis. The entire metropolis was divided into 80 cells (areas) of 1000 m by 1000 m, out of which 33 were randomly selected for noise level assessment. A portable noise meter (AR824) was used to measure noise levels, and a Global Positioning System receiver (Garmin GPS-72H) was employed to record the coordinates of the sample sites for mapping. A risk map of the noise levels was produced using Kriging interpolation based on the spatial spread of measured noise values across the study area. Data on cases of hearing impairment were collected from four major hospitals in the city. Data from field measurements and medical records were subjected to descriptive (frequency and percentage) and inferential (mean, ANOVA, and correlation) statistics using SPSS (version 20.0). ArcMap 10.1 was employed for spatial analysis and mapping. Results showed that mean noise levels ranged from 42.4 ± 4.14 to 88.2 ± 15.1 dBA in the morning, 45.0 ± 6.72 to 86.4 ± 12.5 dBA in the afternoon, and 51.0 ± 6.55 to 84.4 ± 5.19 dBA in the evening across the study area. The interpolated maps identified Kuto, Okelowo, Isale-Igbein, and Sapon as high noise-risk areas. These are the central business district and nucleus of the Abeokuta metropolis, where commercial activities, high traffic volume, and clustered buildings exist. The monitored noise levels varied significantly among the sampled areas in the morning, afternoon, and evening (p < 0.05). A significant correlation was found between diagnosed cases of auditory ailments and noise levels measured in the morning (r = 0.39 at p < 0.05). Common auditory ailments found across the metropolis included impaired hearing (25.8%), tinnitus (16.4%), and otitis (15.0%). The most affected age group was 11-30 years, and males accounted for more cases of hearing impairment (51.2%) than females. The study revealed that environmental noise levels exceeded the recommended standards in the morning, afternoon, and evening in 60.6%, 61%, and 72.7% of the sampled areas, respectively. In summary, environmental noise in the study area is high and contributes to the morbidity of auditory ailments. Areas identified as hot spots of noise pollution should be avoided when siting noise-sensitive activities, and environmental noise monitoring should be included in the mandate of the regulatory agencies in Nigeria.
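
For the statistical part of such an analysis, the association between area noise levels and recorded ailment cases reduces to a Pearson correlation and a simple exceedance count. A minimal sketch follows with made-up values; the data and the 55 dBA guideline used here are assumptions, not figures from the study, and the kriged map itself would be produced separately (e.g., in ArcMap or with a kriging library).

```python
import numpy as np
from scipy import stats

# Illustrative per-area values: mean morning noise level (dBA) and diagnosed
# auditory ailment counts aggregated from hospital records (made-up numbers).
morning_noise = np.array([52.1, 61.3, 70.4, 88.2, 66.0, 74.5, 58.9, 80.2])
ailment_cases = np.array([3, 5, 9, 14, 7, 11, 4, 12])

r, p = stats.pearsonr(morning_noise, ailment_cases)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# Share of sampled areas exceeding an assumed 55 dBA daytime guideline
exceed = morning_noise > 55.0
print(f"{100 * exceed.mean():.1f}% of sampled areas exceed the guideline")
```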

Keywords: noise pollution, associative analysis, auditory impairment, urban, human exposure

Procedia PDF Downloads 128
117 Inertial Motion Capture System for Biomechanical Analysis in Rehabilitation and Sports

Authors: Mario Sandro F. Rocha, Carlos S. Ande, Anderson A. Oliveira, Felipe M. Bersotti, Lucas O. Venzel

Abstract:

Inertial motion capture systems (mocap) are among the most suitable tools for quantitative clinical analysis in rehabilitation and sports medicine. Inertial measurement units (IMUs), composed of accelerometers, gyroscopes, and magnetometers, can measure spatial orientations and calculate displacements with sufficient precision for applications in the biomechanical analysis of movement. Furthermore, this type of system is relatively affordable and has the advantages of portability and independence from external references. In this work, we present the latest version of our inertial motion capture system, based on the foregoing technology, with a Unity interface designed for rehabilitation and sports. In our hardware architecture, only one serial port is required. First, the board client must be connected to the computer by a USB cable. Next, an available serial port is configured and opened to establish communication between the client and the application, and the client then starts scanning for active MOCAP_S servers nearby. The servers play the role of the inertial measurement units that capture the movements of the body and send the data to the client, which in turn creates a packet composed of the server ID, the current timestamp, and the motion capture data defined in the client's pre-configuration of the capture session. In the current version, we can measure the game rotation vector (grv) and linear acceleration (lacc), and we also have a step detector that can be enabled or disabled. The grv data are processed and directly linked to the bones of the 3D model and, along with the lacc and step detector data, are also used to calculate displacements and other variables shown on the graphical user interface. Our user interface was designed to calculate and present variables that are important for rehabilitation and sports, such as cadence, speed, total gait cycle, gait cycle length, obliquity and rotation, and center of gravity displacement. Our goal is to present a low-cost, portable, and wearable system with a friendly interface for application in biomechanics and sports that also delivers high precision and low energy consumption.
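
The abstract does not specify the wire format of the client-server packets, so the sketch below assumes a hypothetical binary layout (server ID, timestamp, game rotation vector, linear acceleration, step flag) simply to illustrate how such packets and a cadence estimate could be handled on the client side.

```python
import struct
from dataclasses import dataclass

# Hypothetical packet layout: uint8 server id, uint32 timestamp (ms),
# 4 floats for the game rotation vector quaternion, 3 floats for linear
# acceleration (m/s^2), uint8 step-detector flag.
PACKET_FMT = "<BI7fB"

@dataclass
class MocapSample:
    server_id: int
    timestamp_ms: int
    grv: tuple    # (w, x, y, z)
    lacc: tuple   # (ax, ay, az)
    step: bool

def parse_packet(raw: bytes) -> MocapSample:
    f = struct.unpack(PACKET_FMT, raw)
    return MocapSample(f[0], f[1], f[2:6], f[6:9], bool(f[9]))

def cadence_steps_per_min(step_timestamps_ms):
    """Cadence computed from the timestamps of step-detector events."""
    if len(step_timestamps_ms) < 2:
        return 0.0
    duration_min = (step_timestamps_ms[-1] - step_timestamps_ms[0]) / 60000.0
    return (len(step_timestamps_ms) - 1) / duration_min
```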

Keywords: biomechanics, inertial sensors, motion capture, rehabilitation

Procedia PDF Downloads 131
116 Evaluation of Cooperative Hand Movement Capacity in Stroke Patients Using the Cooperative Activity Stroke Assessment

Authors: F. A. Thomas, M. Schrafl-Altermatt, R. Treier, S. Kaufmann

Abstract:

Stroke is the main cause of adult disability. Upper limb function in particular is affected in most patients. Recently, cooperative hand movements have been shown to be a promising type of upper limb training in stroke rehabilitation. In these movements, which are frequently found in activities of daily living (e.g., opening a bottle, winding up a blind), the force of one upper limb has to be equally counteracted by the other limb to successfully accomplish a task. The use of standardized and reliable clinical assessments is essential to evaluate the efficacy of therapy and the functional outcome of a patient. Many assessments for upper limb function or impairment are available; however, the evaluation of cooperative hand movement tasks is rarely included in them. Thus, the aim of this study was (i) to develop a novel clinical assessment (CASA - Cooperative Activity Stroke Assessment) for evaluating patients' capacity to perform cooperative hand movements and (ii) to test its intra- and interrater reliability. Furthermore, CASA scores were compared to current gold-standard assessments for the upper extremity in stroke patients (i.e., the Fugl-Meyer Assessment and the Box & Blocks Test). The CASA consists of five cooperative activities of daily living: (1) opening a jar, (2) opening a bottle, (3) opening and closing a zip, (4) unscrewing a nut, and (5) opening a clip box. The goal is to accomplish the tasks as fast as possible. In addition to the quantitative rating (i.e., time), which is converted to a 7-point scale, the quality of the movement is rated on a 4-point scale. To test the reliability of the CASA, fifteen stroke subjects were tested twice within one week by the same two raters. Intra- and interrater reliability was calculated using the intraclass correlation coefficient (ICC) for the total CASA score and single items. Furthermore, Pearson correlation was used to compare the CASA scores to the scores of the Fugl-Meyer upper limb assessment and the Box & Blocks test, which were assessed in every patient in addition to the CASA. ICC values for the total CASA score indicated excellent, and those for single items good to excellent, intra- and interrater reliability. Furthermore, the CASA score was significantly correlated with the Fugl-Meyer and Box & Blocks scores. The CASA provides a reliable assessment of cooperative hand movements, which are crucial for many activities of daily living. Due to its inexpensive setup and easy, fast administration, we suggest it is well suited for clinical application. In conclusion, the CASA is a useful tool for assessing functional status and therapy-related recovery of cooperative hand movement capacity in stroke patients.
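
The reliability and validity statistics named above are straightforward to reproduce; the sketch below uses the pingouin and SciPy packages on made-up scores (four subjects, two raters) purely to show the shape of the computation, not the study's data.

```python
import pandas as pd
import pingouin as pg
from scipy import stats

# Long-format table: each subject's total CASA score given by two raters
df = pd.DataFrame({
    "subject": [1, 1, 2, 2, 3, 3, 4, 4],
    "rater":   ["A", "B"] * 4,
    "casa":    [18, 19, 25, 24, 30, 31, 12, 13],
})

# Interrater reliability of the total CASA score (intraclass correlation)
icc = pg.intraclass_corr(data=df, targets="subject", raters="rater", ratings="casa")
print(icc[["Type", "ICC", "CI95%"]])

# Convergent validity against a gold-standard score such as the Fugl-Meyer
casa_total = [18, 25, 30, 12]
fugl_meyer = [40, 52, 60, 28]
r, p = stats.pearsonr(casa_total, fugl_meyer)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```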

Keywords: activities of daily living, clinical assessment, cooperative hand movements, reliability, stroke

Procedia PDF Downloads 308
115 Validating the Cerebral Palsy Quality of Life for Children (CPQOL-Child) Questionnaire for Use in Sri Lanka

Authors: Shyamani Hettiarachchi, Gopi Kitnasamy

Abstract:

Background: The potentially high level of physical need and dependency experienced by children with cerebral palsy can affect the quality of life (QOL) of the child, the caregiver, and the family. Poor QOL in children with cerebral palsy is associated with the parent-child relationship, limited opportunities for social participation, limited access to healthcare services, psychological well-being, and the child's physical functioning. Given that children with disabilities have little access to remedial support, with inequitable services across districts in Sri Lanka, and given the impact of culture and societal stigma, there may be differing viewpoints across respondents. Objectives: The aim of this study was to evaluate the psychometric properties of the Tamil version of the Cerebral Palsy Quality of Life for Children (CPQOL-Child) Questionnaire. Design: An instrument development and validation study. Methods: Forward and backward translations of the CPQOL-Child primary caregiver form and child self-report form were undertaken by a team comprising a physiotherapist, a speech and language therapist, and two linguists. As part of a pilot phase, the Tamil version of the CPQOL was completed by 45 primary caregivers of children with cerebral palsy and 15 children with cerebral palsy (GMFCS levels 3-4). In addition, the primary caregivers commented on the process of filling in the questionnaire. The psychometric properties of test-retest reliability, internal consistency, and construct validity were evaluated. Results: Test-retest reliability and internal consistency were high. A significant association (p < 0.001) was found between limited motor skills and poor QOL. Cronbach's alpha for the whole questionnaire was 0.95. Similarities and divergences were found between the two groups of respondents. The child respondents identified limited motor skills as associated with physical well-being and autonomy. Similarly, the primary caregivers associated the severity of motor function limitation with reduced physical well-being and autonomy. The trend observed among child respondents was that QOL was related less to the level of impairment than to environmental factors. In addition, the primary caregivers' main concerns about the child's future and lack of independence were not fully captured by the QOL questionnaire employed. Conclusions: Although the initial results show high test-retest reliability and internal consistency of the CPQOL questionnaire, it does not fully reflect the socio-cultural realities and primary concerns of the caregivers. The current findings highlight the need to take child and caregiver perceptions of QOL into account in clinical practice and research, and strongly indicate the need for culture-specific measures of QOL.
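
Cronbach's alpha, reported above as 0.95 for the whole questionnaire, can be computed directly from the item-by-respondent score matrix; a minimal sketch with invented scores follows.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array (n_respondents, n_items), one column per questionnaire item."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Illustrative scores of 6 respondents on a 4-item subscale (made-up data)
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
    [3, 4, 3, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```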

Keywords: cerebral palsy, CPQOL, culture, quality of life

Procedia PDF Downloads 336
114 Instant Data-Driven Robotics Fabrication of Light-Transmitting Ceramics: A Responsive Computational Modeling Workflow

Authors: Shunyi Yang, Jingjing Yan, Siyu Dong, Xiangguo Cui

Abstract:

Current architectural façade design practices incorporate various daylighting and solar radiation analysis methods, which emphasize the impact of geometry on façade design. There is scope to extend this knowledge into methods that address material translucency, porosity, and form. Such approaches can also achieve these conditions through adaptive robotic manufacturing that exploits material dynamics within the design and alleviates fabrication waste from molds, ultimately accelerating the autonomous manufacturing system. Besides analysis of environmental solar radiation in building façade design, there is also an open research question of how lighting effects can be precisely controlled by engaging real-time, data-driven robot control and manipulating material properties. Ceramics offers a wide range of transmittance and deformation potential for robotic control once its material properties are studied. This paper presents a semi-autonomous system that engages real-time data-driven robotics control, hardware kit design, environmental building studies, human interaction, and exploratory research and experiments. Our objectives are to investigate the relationship between the physio-material properties of different clay bodies or ceramics and their transmittance; to explore a feedback system that uses instant lighting data in robotic fabrication to achieve precise lighting effects; and to design suitable end effectors and robot behaviors for different stages of deformation. We experiment with architectural clay, a façade material that is potentially translucent at a certain stage and can therefore respond to light. Studying the relationship between form, material properties, and porosity can help create different interior and exterior light effects and provide façade solutions for specific architectural functions. The key idea is to maximize the utilization of in-progress robotic fabrication and ceramic materiality to create a highly integrated autonomous system for lighting façade design and manufacture.
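
The lighting feedback loop mentioned above can be pictured as a simple closed loop in which a transmitted-light reading drives the next forming action. The sketch below is purely conceptual: read_lux and thin_wall are hypothetical stand-ins for the sensor and robot interfaces, which the abstract does not specify.

```python
import time

TARGET_LUX = 120.0   # desired transmitted illuminance behind the clay panel (assumed)
TOLERANCE = 5.0

def read_lux() -> float:
    """Placeholder for the light sensor behind the workpiece."""
    raise NotImplementedError

def thin_wall(step_mm: float) -> None:
    """Placeholder for a robot motion that locally reduces wall thickness."""
    raise NotImplementedError

def regulate_transmittance():
    while True:
        error = TARGET_LUX - read_lux()
        if abs(error) <= TOLERANCE:
            break                                  # desired lighting effect reached
        thin_wall(step_mm=0.05 * max(error, 0.0))  # remove more material when far off
        time.sleep(0.5)                            # let the reading settle
```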

Keywords: light transmittance, data-driven fabrication, computational design, computer vision, gamification for manufacturing

Procedia PDF Downloads 104
113 Handy EKG: Low-Cost ECG For Primary Care Screening In Developing Countries

Authors: Jhiamluka Zservando Solano Velasquez, Raul Palma, Alejandro Calderon, Servio Paguada, Erick Marin, Kellyn Funes, Hana Sandoval, Oscar Hernandez

Abstract:

Background: Screening for cardiac conditions in primary care in developing countries can be challenging, and Honduras is no exception. One of the main limitations is the underfunding of the healthcare system in general, which makes conventional ECG acquisition a secondary priority. Objective: To develop a low-cost ECG to improve screening for arrhythmias in primary care and communication with specialists in secondary and tertiary care. Methods: A portable, pocket-sized, low-cost 3-lead ECG (Handy EKG) was designed. The device is autonomous and has Wi-Fi/Bluetooth connectivity options. A mobile app was designed that can access online servers running machine learning, a subset of artificial intelligence, to learn from the data and aid clinicians in interpreting readings. Additionally, the device uses the online servers to transfer the patient's data and readings to a specialist in secondary or tertiary care. Fifty randomized volunteer patients participated in testing the device. The patients had no previous cardiac-related conditions. One reading was performed with a conventional ECG and three readings with the Handy EKG using different lead positions. This project was possible thanks to funding provided by the National Autonomous University of Honduras. Results: Preliminary results show that the Handy EKG records cardiac activity similar to that of a conventional electrocardiograph in leads I, II, and III, depending on the position of the leads, at a lower cost. The wave and segment durations, amplitudes, and morphology of the readings were similar to the conventional ECG, and interpretation was sufficient to conclude whether there was an arrhythmia or not. Two cases of prolonged PR segment were found in the readings of both ECG devices. Conclusion: A frugal innovation approach can allow lower-income countries to develop innovative medical devices such as the Handy EKG to fulfill unmet needs at lower prices without compromising effectiveness, safety, and quality. The Handy EKG provides a solution for primary care screening at a much lower cost and allows convenient storage of the readings on online servers, where patients' clinical data can then be accessed remotely by cardiology specialists.
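
One of the simplest automated checks such a device or its companion app could run is a heart-rate estimate from R-peak spacing. The sketch below uses SciPy's peak finder on a synthetic trace; the sampling rate and thresholds are assumptions, not specifications of the Handy EKG.

```python
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(ecg: np.ndarray, fs: float) -> float:
    """Estimate heart rate from one ECG lead by detecting R peaks."""
    peaks, _ = find_peaks(ecg, height=np.percentile(ecg, 95), distance=int(0.4 * fs))
    rr = np.diff(peaks) / fs            # R-R intervals in seconds
    return 60.0 / rr.mean()

# Synthetic 10 s trace at 250 Hz with fake R peaks every 0.8 s (75 bpm)
fs = 250.0
t = np.arange(0, 10, 1 / fs)
ecg = np.zeros_like(t)
ecg[(np.arange(12) * 0.8 * fs).astype(int)] = 1.0
print(f"{heart_rate_bpm(ecg, fs):.0f} bpm")
```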

Keywords: low-cost hardware, portable electrocardiograph, prototype, remote healthcare

Procedia PDF Downloads 170
112 A Theragnostic Approach for Alzheimer’s Disease Focused on Phosphorylated Tau

Authors: Tomás Sobrino, Lara García-Varela, Marta Aramburu-Núñez, Mónica Castro, Noemí Gómez-Lado, Mariña Rodríguez-Arrizabalaga, Antía Custodia, Juan Manuel Pías-Peleteiro, José Manuel Aldrey, Daniel Romaus-Sanjurjo, Ángeles Almeida, Pablo Aguiar, Alberto Ouro

Abstract:

Introduction: Alzheimer's disease (AD) and other tauopathies are primary causes of dementia, producing progressive cognitive deterioration that entails serious repercussions for the patients' performance of daily tasks. Currently, there is no effective approach for the early diagnosis and treatment of AD and tauopathies. This study proposes a theragnostic approach based on the importance of phosphorylated tau protein (p-Tau) in the early pathophysiological processes of AD. We have developed a novel theragnostic monoclonal antibody (mAb) to provide both diagnostic and therapeutic effects. Methods/Results: We developed a p-Tau mAb, which was conjugated with deferoxamine for radiolabeling with Zirconium-89 (89Zr) for PET imaging, as well as with fluorescent dyes for immunofluorescence assays. The p-Tau mAb was evaluated in vitro for toxicity by MTT assay, LDH activity, propidium iodide/Annexin V assay, caspase-3, and mitochondrial membrane potential (MMP) assay in both a mouse endothelial cell line (bEnd.3) and cortical primary neuron cell cultures. Importantly, no toxic effects were detected up to p-Tau mAb concentrations greater than 100 ug/mL. In vivo experiments in tauopathy model mice (PS19) show that the 89Zr-pTau-mAb and 89Zr-Fragments-pTau-mAb are stable in circulation for up to 10 days without toxic effects. However, less than 0.2% reached the brain, so further strategies have to be designed for crossing the blood-brain barrier (BBB). Moreover, an intraparenchymal treatment strategy was carried out. The PS19 mice were operated on to implant osmotic pumps (Alzet 1004) at two different times, at 4 and 7 months, to provide controlled release, for one month each, of the B6 antibody or the IgG1 control antibody. We demonstrated that B6-treated mice maintained their motor and memory abilities significantly better than IgG1-treated mice. In addition, we observed a significant reduction in p-Tau deposits in the brain. Conclusions/Discussion: A theragnostic p-Tau mAb was developed. We demonstrated that our p-Tau mAb recognizes very early pathological forms of p-Tau by non-invasive techniques, such as PET. In addition, the p-Tau mAb has no toxic effects, both in vitro and in vivo. Although the p-Tau mAb is stable in circulation, only 0.2% reaches the brain. However, direct intraventricular treatment significantly reduces cognitive impairment in Alzheimer's animal models, as well as the accumulation of toxic p-Tau species.

Keywords: alzheimer's disease, theragnosis, tau, PET, immunotherapy, tauopathies

Procedia PDF Downloads 58
111 Mathematics as the Foundation for the STEM Disciplines: Different Pedagogical Strategies Addressed

Authors: Marion G. Ben-Jacob, David Wang

Abstract:

There is a mathematics requirement for entry-level college and university students, especially those who plan to study STEM (science, technology, engineering, and mathematics). Most of them take College Algebra, and to continue their studies, they need to succeed in this course. Different pedagogical strategies are employed to promote the success of our students. There is, of course, the traditional method of teaching: lecture, examples, and problems for students to solve. The Emporium Model, another pedagogical approach, replaces traditional lectures with a learning resource center model featuring interactive software and on-demand personalized assistance. This presentation will compare these two pedagogical methods and report the results of a study on this comparison. Math is the foundation for science, technology, and engineering. It is generally used in STEM to find patterns in data. These patterns can be used to test relationships, draw general conclusions about data, and model the real world. In STEM, solutions to problems are analyzed, reasoned, and interpreted using math abilities in an assortment of real-world scenarios. This presentation will examine specific examples of how math is used in the different STEM disciplines. Math becomes practical in science when it is used to model natural and artificial experiments to identify a problem and develop a solution for it. As we analyze data, we use math to find the statistical correlation between a cause and its effect. Professionals who use math include data scientists, biologists, and geologists. Without math, most technology would not be possible. Math is the basis of binary, and without programming, you just have the hardware. Addition, subtraction, multiplication, and division are also used in almost every program written. Mathematical algorithms are inherent in software as well. Mechanical engineers analyze scientific data to design robots by applying math and using software. Electrical engineers use math to help design and test electrical equipment. They also use math when creating computer simulations and designing new products. Chemical engineers often use mathematics in the lab. Advanced computer software is used to aid their research and production processes, modeling theoretical synthesis techniques and properties of chemical compounds. Mathematics mastery is crucial for success in the STEM disciplines. Pedagogical research on formative strategies and the necessary topics to be covered is essential.

Keywords: emporium model, mathematics, pedagogy, STEM

Procedia PDF Downloads 59
110 Study the Effect of Liquefaction on Buried Pipelines during Earthquakes

Authors: Mohsen Hababalahi, Morteza Bastami

Abstract:

Buried pipeline damage correlations are a critical part of loss estimation procedures applied to lifelines for future earthquakes. The vulnerability of buried pipelines to earthquakes and liquefaction has been observed during several previous earthquakes, and there are many comprehensive reports on these events. One of the main causes of damage to buried pipelines during earthquakes is liquefaction. The necessary conditions for this phenomenon are loose sandy soil, saturation of the soil layer, and sufficient earthquake intensity. Because pipelines are structurally very different from other structures (being long and of light mass), comparison of their performance in previous earthquakes with that of other structures shows that the liquefaction hazard for buried pipelines is not high unless contributing factors such as earthquake intensity and loose soil conditions are severe. Recent liquefaction research for buried pipelines includes experimental and theoretical work as well as damage investigations during actual earthquakes. The damage investigations have revealed that the pipeline damage ratio (number/km) is much larger in liquefied ground than in shaken ground without liquefaction, according to damage statistics from past severe earthquakes, and that damage to joints and to pipelines connected with manholes was remarkable. The purpose of this research is a numerical study of buried pipelines under the effect of liquefaction, using the 2013 Dashti (Iran) earthquake as a case study. The water supply and electrical distribution systems of this township were interrupted during the earthquake, and water transmission pipelines were damaged severely due to the occurrence of liquefaction. The model consists of a polyethylene pipeline 100 meters long and 0.8 meters in diameter, covered by light sandy soil at a burial depth of 2.5 meters below the surface. Since the finite element method is used relatively successfully to solve geotechnical problems, we used this method for the numerical analysis. Evaluating this case requires geotechnical data, classification of earthquake levels, determination of the parameters governing the probability of liquefaction, and three-dimensional finite element modeling of the interaction between soil and pipelines. The results of this study indicate that the effect of liquefaction is a function of pipe diameter, soil type, and peak ground acceleration. There is a clear increase in the percentage of damage with increasing liquefaction severity. The results also indicate that although in this form of analysis the damage is always associated with a certain pipe material, the nominally defined 'failures' mainly comprise failures of particular components (joints, connections, fire hydrant details, crossovers, laterals) rather than material failures. Finally, some retrofit suggestions are given to decrease the liquefaction risk for buried pipelines.
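
As a reference point for the liquefaction-susceptibility side of such a study, the widely used simplified Seed-Idriss screening calculation is sketched below. It is not the 3-D finite element model described in the abstract, and the stress values are illustrative assumptions for a shallow burial depth.

```python
def depth_reduction_factor(z: float) -> float:
    """NCEER stress-reduction coefficient r_d for depth z in metres (z <= 23 m)."""
    if z <= 9.15:
        return 1.0 - 0.00765 * z
    return 1.174 - 0.0267 * z

def cyclic_stress_ratio(a_max_g: float, sigma_v: float, sigma_v_eff: float, z: float) -> float:
    """Seed-Idriss simplified CSR = 0.65 (a_max/g)(sigma_v / sigma_v')(r_d)."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * depth_reduction_factor(z)

# Illustrative values near the 2.5 m burial depth mentioned above
# (total and effective vertical stresses in kPa are assumed, not from the study)
csr = cyclic_stress_ratio(a_max_g=0.35, sigma_v=45.0, sigma_v_eff=25.0, z=2.5)
print(f"CSR = {csr:.2f}")   # compared against the soil's cyclic resistance ratio (CRR)
```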

Keywords: liquefaction, buried pipelines, lifelines, earthquake, finite element method

Procedia PDF Downloads 501
109 A Radiofrequency Based Navigation Method for Cooperative Robotic Communities in Surface Exploration Missions

Authors: Francisco J. García-de-Quirós, Gianmarco Radice

Abstract:

When considering small robots working as a cooperative community for Moon surface exploration, navigation and inter-node communication become critical issues for mission success. For this approach to succeed, however, it is necessary to deploy the infrastructure required for the robotic community to achieve efficient self-localization as well as relative positioning and communication between nodes. In this paper, an exploration mission concept in which two cooperative robotic systems co-exist is presented. This paradigm hinges on a community of reference agents that provide support in terms of communication and navigation to a second agent community tasked with exploration goals. The work focuses on the role of the agent community in charge of overall support and, more specifically, on the positioning and navigation methods implemented in RF microwave bands, which are combined with the communication services. An analysis of the different methods for range and position calculation is presented, as well as the main factors limiting precision and resolution, such as phase and frequency noise in RF reference carriers and drift mechanisms such as thermal drift and random walk. The effects of carrier frequency instability due to phase noise are categorized into different contributing bands, and the impact of these spectral regions is considered both in terms of absolute position and relative speed. A mission scenario is finally proposed, and key metrics in terms of mass and power consumption for the required payload hardware are assessed. For this purpose, an application case involving an RF communication network in the UHF band is described, coexisting with the communications network used by the exploring agents to communicate within their own community and with the mission support agents. The proposed approach implements a substantial improvement in planetary navigation since it provides self-localization capabilities for robotic agents characterized by very low mass, volume, and power budgets, thus enabling precise navigation for agents of reduced dimensions. Furthermore, a common, shared localization radiofrequency infrastructure enables new interaction mechanisms such as the spatial arrangement of agents over the area of interest for distributed sensing.
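
For orientation, the basic range and range-rate relations underlying such RF navigation are shown below: a two-way time-of-flight exchange for distance and the carrier Doppler shift for relative speed. This is a generic sketch, not the specific ranging protocol proposed in the paper, and the example numbers are assumptions.

```python
C = 299_792_458.0  # speed of light, m/s

def two_way_range(t_round: float, t_reply: float) -> float:
    """Range from a two-way time-of-flight exchange.

    t_round : time at the initiator between request and response (s)
    t_reply : known turnaround delay inside the responding agent (s)
    """
    return C * (t_round - t_reply) / 2.0

def range_rate_from_doppler(f_rx: float, f_tx: float) -> float:
    """Relative (closing) speed from the Doppler shift of the RF carrier."""
    return C * (f_rx - f_tx) / f_tx

print(f"{two_way_range(2.0e-6, 1.0e-6):.1f} m")              # ~150 m
print(f"{range_rate_from_doppler(437.000001e6, 437e6):.2f} m/s")
```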

Keywords: cooperative robotics, localization, robot navigation, surface exploration

Procedia PDF Downloads 276
108 A Comparison of Inverse Simulation-Based Fault Detection in a Simple Robotic Rover with a Traditional Model-Based Method

Authors: Murray L. Ireland, Kevin J. Worrall, Rebecca Mackenzie, Thaleia Flessa, Euan McGookin, Douglas Thomson

Abstract:

Robotic rovers designed to work in extra-terrestrial environments present a unique challenge in terms of the reliability and availability of systems throughout the mission. Should a fault occur, with the nearest human potentially millions of kilometres away, detection and identification of the fault must be performed solely by the robot and its subsystems. Faults in the system sensors are relatively straightforward to detect through the residuals produced by comparing the system output with that of a simple model. However, faults in the input, that is, in the actuators of the system, are harder to detect. A step change in the input signal, caused potentially by the loss of an actuator, can propagate through the system, resulting in complex residuals in multiple outputs. These residuals can be difficult to isolate or distinguish from residuals caused by environmental disturbances. While a more complex fault detection method or additional sensors could be used to solve these issues, an alternative is presented here. Using inverse simulation (InvSim), the inputs and outputs of the mathematical model of the rover system are reversed: for a desired trajectory, the corresponding actuator inputs are obtained. A step fault near the input then manifests itself as a step change in the residual between the system inputs and the input trajectory obtained through inverse simulation. This approach avoids the need for additional hardware on a mass- and power-critical system such as the rover. The InvSim fault detection method is applied to a simple four-wheeled rover in simulation. Additive system faults and an external disturbance force are applied to the vehicle in turn, such that the dynamic response and sensor output of the rover are affected. Basic model-based fault detection is first employed to provide output residuals, which may be analysed to provide information on the fault or disturbance. InvSim-based fault detection is then employed, similarly providing input residuals which give further information on the fault or disturbance. The input residuals are shown to provide clearer information on the location and magnitude of an input fault than the output residuals. Additionally, they allow faults to be more clearly discriminated from environmental disturbances.
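
The input-residual idea is easy to see on a toy model. Below, a scalar first-order plant stands in for the rover: the measured output is inverted to recover the input that would have produced it, and the difference from the commanded input exposes an actuator step fault. The plant coefficients and fault size are illustrative, not values from the paper.

```python
import numpy as np

a, b, n = 0.9, 0.5, 200                              # toy plant y[k+1] = a*y[k] + b*u[k]
u_cmd = np.ones(n)                                   # commanded input trajectory
fault = np.where(np.arange(n) >= 120, -0.4, 0.0)     # actuator step fault at k = 120
u_act = u_cmd + fault

# Forward simulation of the faulty plant (what the sensors would report)
y = np.zeros(n + 1)
for k in range(n):
    y[k + 1] = a * y[k] + b * u_act[k]

# Inverse simulation: recover the input consistent with the measured output
u_inv = (y[1:] - a * y[:-1]) / b

# The input residual shows a clean step at the fault instant, easy to threshold
residual = u_inv - u_cmd
print("fault flagged at k =", int(np.argmax(np.abs(residual) > 0.2)))
```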

Keywords: fault detection, ground robot, inverse simulation, rover

Procedia PDF Downloads 291
107 RA-Apriori: An Efficient and Faster MapReduce-Based Algorithm for Frequent Itemset Mining on Apache Flink

Authors: Sanjay Rathee, Arti Kashyap

Abstract:

Extraction of useful information from large datasets is one of the most important research problems, and association rule mining is one of the best methods for this purpose. Finding possible associations between items in large transaction-based datasets (finding frequent patterns) is the most important part of association rule mining. Many algorithms exist to find frequent patterns, but the Apriori algorithm remains a preferred choice due to its ease of implementation and its natural tendency to be parallelized. Many single-machine Apriori variants exist, but the massive amount of data available these days exceeds the capacity of a single machine. Therefore, to meet the demands of this ever-growing data, there is a need for a multi-machine Apriori algorithm. For this type of distributed application, MapReduce is a popular fault-tolerant framework. Hadoop is one of the best open-source software frameworks with the MapReduce approach for distributed storage and distributed processing of huge datasets using clusters built from commodity hardware. However, the heavy disk I/O at each iteration of a highly iterative algorithm like Apriori makes Hadoop inefficient. A number of MapReduce-based platforms have been developed for parallel computing in recent years. Among them, two platforms, Spark and Flink, have attracted a lot of attention because of their built-in support for distributed computation. Earlier, we proposed a Reduced-Apriori algorithm on the Spark platform which outperforms parallel Apriori, first because of the use of Spark and second because of the improvement we proposed to standard Apriori. This work is therefore a natural sequel and targets implementing, testing, and benchmarking Apriori, Reduced-Apriori, and our new algorithm ReducedAll-Apriori on Apache Flink, comparing them with the Spark implementation. Flink, a streaming dataflow engine, overcomes the disk I/O bottlenecks of MapReduce, providing an ideal platform for distributed Apriori. Flink's pipelined structure allows the next iteration to start as soon as partial results of the earlier iteration are available, so there is no need to wait for all reducer results before starting the next iteration. We conduct in-depth experiments to gain insight into the effectiveness, efficiency, and scalability of the Apriori and RA-Apriori algorithms on Flink.
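
For readers unfamiliar with the algorithm being distributed here, a plain single-machine Apriori (level-wise candidate generation plus support counting, with the prune step omitted for brevity) looks like the following; it is a reference sketch, not the Flink or Spark implementation benchmarked in the paper.

```python
from collections import defaultdict

def apriori(transactions, min_support):
    """Frequent itemsets by level-wise Apriori on a single machine."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def frequent_of(candidates):
        counts = defaultdict(int)
        for t in transactions:
            for c in candidates:
                if c <= t:
                    counts[c] += 1
        return {c: cnt / n for c, cnt in counts.items() if cnt / n >= min_support}

    level = frequent_of({frozenset([i]) for t in transactions for i in t})
    result, k = dict(level), 2
    while level:
        prev = list(level)
        # join step: unions of frequent (k-1)-itemsets that yield k-itemsets
        candidates = {a | b for a in prev for b in prev if len(a | b) == k}
        level = frequent_of(candidates)
        result.update(level)
        k += 1
    return result

baskets = [{"bread", "milk"}, {"bread", "butter"}, {"bread", "milk", "butter"}, {"milk"}]
for itemset, support in apriori(baskets, min_support=0.5).items():
    print(sorted(itemset), round(support, 2))
```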

Keywords: apriori, apache flink, MapReduce, spark, Hadoop, R-Apriori, frequent itemset mining

Procedia PDF Downloads 278
106 The Effects of a Hippotherapy Simulator in Children with Cerebral Palsy: A Pilot Study

Authors: Canan Gunay Yazici, Zubeyir Sarı, Devrim Tarakci

Abstract:

Background: Hippotherapy is considered a global technique used in the rehabilitation of children with cerebral palsy, as it improves gait pattern, balance, postural control, and gross motor skill development, but it encounters practical problems (such as the high cost of horse care, nutrition, and housing). Hippotherapy simulators have been developed in recent years to overcome these problems. These devices aim to reproduce on patients the effects of hippotherapy with a real horse by simulating the horse's movements. Objectives: To evaluate the efficacy of a hippotherapy simulator on gross motor function, sitting postural control, and dynamic balance of children with cerebral palsy (CP). Methods: Fourteen children with CP participated, aged 6-15 years: seven with a diagnosis of spastic hemiplegia, five with diplegia, and two with triplegia, Gross Motor Function Classification System levels I-III. The Horse Riding Simulator (HRS), which includes a four-speed program (warm-up and levels 1-3), was used as the hippotherapy simulator. First, each child received neurodevelopmental therapy (NDT; 45 min, twice weekly for eight weeks). Subsequently, the same children completed HRS+NDT (30 min and 15 min, respectively, twice weekly for eight weeks). Children were assessed pre-treatment and at the end of the 8th and 16th weeks. Gross motor function, sitting postural control, and dynamic sitting and standing balance were evaluated by the Gross Motor Function Measure-88 (GMFM-88; dimensions B, D, E and total score), the Trunk Impairment Scale (TIS), the Pedalo® Sensamove Balance Test, and the Pediatric Balance Scale (PBS), respectively. The Scientific Research Projects Unit of Marmara University supported this study. Results: All measured variables increased significantly compared to baseline after both interventions (NDT and HRS+NDT), except for dynamic sitting balance evaluated by Pedalo®. Especially after HRS+NDT, the increase in the measured variables was considerably higher than after NDT. After NDT, the total score of the GMFM-88 (mean baseline 62.2 ± 23.5; mean NDT 66.6 ± 22.2; p < 0.05), TIS (10.4 ± 3.4; 12.1 ± 3; p < 0.05), PBS (37.4 ± 14.6; 39.6 ± 12.9; p < 0.05), Pedalo® sitting (91.2 ± 6.7; 92.3 ± 5.2; p > 0.05) and Pedalo® standing balance scores (80.2 ± 10.8; 82.5 ± 11.5; p < 0.05) increased by 7.1%, 2%, 3.9%, 5.2%, and 6%, respectively. After HRS+NDT, the total score of the GMFM-88 (mean baseline 62.2 ± 23.5; mean HRS+NDT 71.6 ± 21.4; p < 0.05), TIS (10.4 ± 3.4; 15.6 ± 2.9; p < 0.05), PBS (37.4 ± 14.6; 42.5 ± 12; p < 0.05), Pedalo® sitting (91.2 ± 6.7; 93.8 ± 3.7; p > 0.05) and standing balance scores (80.2 ± 10.8; 86.2 ± 5.6; p < 0.05) increased by 15.2%, 6%, 7.3%, 6.4%, and 11.9%, respectively, compared to the initial values. Conclusion: Neurodevelopmental therapy provided significant improvements in gross motor function, sitting postural control, and sitting and standing balance of children with CP. When the hippotherapy simulator was added to the treatment program, these functions improved further (especially gross motor function and dynamic balance). As a result, this pilot study showed that the hippotherapy simulator could be a useful alternative to neurodevelopmental therapy for improving gross motor function, sitting postural control, and dynamic balance in children with CP.

Keywords: balance, cerebral palsy, hippotherapy, rehabilitation

Procedia PDF Downloads 129
105 Switching of Series-Parallel Connected Modules in an Array for Partially Shaded Conditions in a Pollution Intensive Area Using High Powered MOSFETs

Authors: Osamede Asowata, Christo Pienaar, Johan Bekker

Abstract:

Photovoltaic (PV) modules may become a trend for future PV systems because of their greater flexibility in distributed system expansion, easier installation due to their nature, and higher system-level energy harnessing capabilities under shaded or PV manufacturing mismatch conditions, as compared to single- or multi-string inverters. Novel residential-scale PV arrays are commonly connected to the grid either by a single DC-AC inverter connected to a series, parallel, or series-parallel string of PV panels, or by many small DC-AC inverters which connect one or two panels directly to the AC grid. With increasing worldwide interest in sustainable energy production and use, there is renewed focus on the power electronic converter interface for DC energy sources. Three specific examples of such DC energy sources that will have a role in distributed generation and sustainable energy systems are the photovoltaic (PV) panel, the fuel cell stack, and batteries of various chemistries. A high-efficiency inverter using metal oxide semiconductor field-effect transistors (MOSFETs) for all active switches is presented for non-isolated photovoltaic and AC-module applications. The proposed configuration features high efficiency over a wide load range, low ground leakage current, and low output AC-current distortion with no need for split capacitors. The detailed power stage operating principles, pulse width modulation scheme, multilevel bootstrap power supply, and integrated gate drivers for the proposed inverter are described. Experimental results from a hardware prototype show not only that the MOSFETs are efficient in the system, but also that the ground leakage current issues are alleviated in the proposed inverter and that a maximum efficiency of 98% is achieved with the associated driver circuit. This, in turn, motivates a possible photovoltaic panel switching technique, which will help reduce the effect of cloud movements as well as improve the overall efficiency of the system.
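
The pulse width modulation scheme mentioned above follows the usual sine-triangle comparison; a minimal sketch of generating complementary gate signals for one inverter leg is given below, with illustrative frequencies and modulation index (dead-time insertion and the multilevel bootstrap supply are omitted).

```python
import numpy as np

f_grid, f_carrier, m_idx, fs = 50.0, 10_000.0, 0.9, 1_000_000.0

t = np.arange(0, 1 / f_grid, 1 / fs)                      # one grid period
reference = m_idx * np.sin(2 * np.pi * f_grid * t)        # sinusoidal reference
carrier = 2 * np.abs(2 * ((t * f_carrier) % 1) - 1) - 1   # triangular carrier, -1..1

gate_high = reference > carrier                           # high-side MOSFET gate signal
gate_low = ~gate_high                                     # complementary low-side signal

print(f"average high-side duty over one grid period: {gate_high.mean():.2f}")  # ~0.5
```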

Keywords: grid connected photovoltaic (PV), Matlab efficiency simulation, maximum power point tracking (MPPT), module integrated converters (MICs), multilevel converter, series connected converter

Procedia PDF Downloads 112
104 Analyzing Electromagnetic and Geometric Characterization of Building Insulation Materials Using the Transient Radar Method (TRM)

Authors: Ali Pourkazemi

Abstract:

The transient radar method (TRM) is a non-destructive method introduced by the authors a few years ago. It can be classified as a wave-based non-destructive testing (NDT) method applicable over a wide frequency range; nevertheless, it requires only a narrow band, located anywhere from a few GHz to a few THz depending on the application. As a time-of-flight and real-time method, TRM can measure the electromagnetic properties of the sample under test not only quickly and accurately, but also blindly, meaning that it requires no prior knowledge of the sample under test. For multi-layer structures, TRM is not only able to detect changes related to any parameter within the multi-layer structure but can also measure the electromagnetic properties and thickness of each layer individually. Although temperature, humidity, and general environmental conditions may affect the sample under test, they do not affect the accuracy of the blind TRM algorithm. In this paper, the electromagnetic properties as well as the thickness of individual building insulation materials, as single-layer structures, are measured experimentally. Finally, the correlation between the reflection coefficients and other technical parameters such as sound insulation, thermal resistance, thermal conductivity, compressive strength, and density is investigated. The samples studied are 30 cm x 50 cm, with thicknesses varying from a few millimeters to 6 centimeters. The experiment is performed with both bistatic and differential hardware at 10 GHz. Since it is a narrow-band, free-space, real-time sensing system with high-speed computation for analysis, it has a wide range of potential applications, e.g., in the construction industry, rubber industry, piping industry, wind energy industry, automotive industry, biotechnology, food industry, pharmaceuticals, etc. Detection of metallic and plastic pipes, wires, etc., through or behind walls is a specific application for the construction industry.
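
For a thick, low-loss, non-magnetic single-layer sample measured at normal incidence, the reflection coefficient and the echo delay map to permittivity and thickness through the Fresnel and time-of-flight relations sketched below. This is a simplified special case for illustration, with assumed example values; the blind TRM algorithm itself handles the general multilayer problem.

```python
import numpy as np

C = 299_792_458.0  # m/s

def permittivity_from_reflection(gamma: float) -> float:
    """Relative permittivity from the normal-incidence reflection coefficient
    at an air interface: gamma = (1 - sqrt(eps_r)) / (1 + sqrt(eps_r))."""
    return ((1 - gamma) / (1 + gamma)) ** 2

def thickness_from_delay(delay_s: float, eps_r: float) -> float:
    """Thickness from the round-trip delay between front- and back-face echoes."""
    return C * delay_s / (2 * np.sqrt(eps_r))

eps_r = permittivity_from_reflection(-0.2)
print(f"eps_r = {eps_r:.2f}")                                     # ~2.25
print(f"d = {thickness_from_delay(0.5e-9, eps_r) * 100:.1f} cm")  # ~5 cm
```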

Keywords: transient radar method, blind electromagnetic geometrical parameter extraction technique, ultrafast nondestructive multilayer dielectric structure characterization, electronic measurement systems, illumination, data acquisition performance, submillimeter depth resolution, time-dependent reflected electromagnetic signal blind analysis method, EM signal blind analysis method, time domain reflectometer, microwave, millimeter wave frequencies

Procedia PDF Downloads 62
103 6-Degree-Of-Freedom Spacecraft Motion Planning via Model Predictive Control and Dual Quaternions

Authors: Omer Burak Iskender, Keck Voon Ling, Vincent Dubanchet, Luca Simonini

Abstract:

This paper presents a guidance and control (G&C) strategy to approach and synchronize with potentially rotating targets. The proposed strategy generates and tracks a safe trajectory for space servicing missions, including tasks such as approaching, inspecting, and capturing. The main objective of this paper is to validate the G&C laws using a hardware-in-the-loop (HIL) setup with realistic rendezvous and docking equipment. Throughout this work, the assumption of full relative state feedback is relaxed by using onboard sensors that introduce realistic errors and delays, while the proposed closed-loop approach demonstrates robustness to this challenge. Moreover, the G&C blocks are unified via the model predictive control (MPC) paradigm, and the coupling between translational and rotational motion is addressed via a dual-quaternion-based kinematic description. In this work, G&C is formulated as a convex optimization problem in which constraints such as thruster limits and output constraints are explicitly handled. Furthermore, the Monte Carlo method is used to evaluate the robustness of the proposed method to initial condition errors, uncertainty in the target's motion and attitude, and actuator errors. A capture scenario is tested with a robotic test bench that has onboard sensors which estimate the position and orientation of a drifting satellite through camera imagery. Finally, the approach is compared with currently used robust H-infinity controllers and the guidance profile provided by the industrial partner. The HIL experiments demonstrate that the proposed strategy is a potential candidate for future space servicing missions because 1) the algorithm is real-time implementable, as convex programming offers deterministic convergence properties and guarantees a finite-time solution, 2) critical physical and output constraints are respected, 3) robustness to sensor errors and uncertainties in the system is proven, and 4) it couples translational motion with rotational motion.
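
As a flavour of the convex MPC formulation, the sketch below sets up a tiny 1-D double-integrator rendezvous problem with a thruster limit using CVXPY. It is only a toy stand-in for the paper's 6-DOF dual-quaternion formulation; the horizon, weights, and limits are illustrative assumptions.

```python
import numpy as np
import cvxpy as cp

dt, N = 1.0, 20
A = np.array([[1.0, dt], [0.0, 1.0]])      # relative position / velocity dynamics
B = np.array([[0.5 * dt**2], [dt]])
u_max = 0.1                                # thruster acceleration limit (m/s^2)
x0 = np.array([100.0, -1.0])               # 100 m behind the target, closing at 1 m/s

x = cp.Variable((2, N + 1))
u = cp.Variable((1, N))
cost, constr = 0, [x[:, 0] == x0]
for k in range(N):
    cost += cp.sum_squares(x[:, k + 1]) + 10 * cp.sum_squares(u[:, k])
    constr += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
               cp.abs(u[:, k]) <= u_max]

cp.Problem(cp.Minimize(cost), constr).solve()
print("first thrust command:", float(u.value[0, 0]))
```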

Keywords: dual quaternion, model predictive control, real-time experimental test, rendezvous and docking, spacecraft autonomy, space servicing

Procedia PDF Downloads 133
102 Female Autism Spectrum Disorder and Understanding Rigid Repetitive Behaviors

Authors: Erin Micali, Katerina Tolstikova, Cheryl Maykel, Elizabeth Harwood

Abstract:

Female ASD is seldom studied separately from male ASD. Further, females with ASD are disproportionately underrepresented in the research, at a ratio of 3:1 (male to female). As such, much of the current understanding of female rigid repetitive behaviors (RRBs) stems from research on male RRBs. This can be detrimental to understanding female ASD because it largely discounts female camouflaging and the possibility that females present their autistic symptoms differently. Current literature suggests that females with ASD engage in fewer RRBs than males with ASD, and when females do engage in RRBs, they are likely to engage in more subtle, less overt obsessions and repetitive behaviors than males. Method: The current study utilized a mixed-methods, cross-sectional design to identify the type and frequency of RRBs that females with ASD engage in. The researcher recruited only females for the present study, with the criteria that they be at least six years old and not have a co-occurring cognitive impairment. Results: The researcher collected previous testing data (Autism Diagnostic Interview-Revised (ADI-R), Child or Adolescent/Adult Sensory Profile-2, Autism/Empathy Quotient, Yale-Brown Obsessive Compulsive Checklist, Rigid Repetitive Behavior Checklist (evaluator-created list), and a demographic questionnaire) from 25 participants. The participants' ages ranged from 6 to 52. The participants were 96% Caucasian and 4% Latin American. Qualitative analysis found that the current participant pool engaged in six RRB themes: repetitive behaviors, socially restrictive behaviors, repetitive speech, difficulty with transitions, obsessive behaviors, and restricted interests. The current dataset engaged in socially restrictive behaviors and restricted interests most frequently. Within the main themes, 40 subthemes were isolated, defined, and analyzed. Further, a preliminary quantitative analysis was run to determine whether age impacted camouflaging behaviors and the overall presentation of RRBs; within this dataset, this was not found. Further qualitative analysis will be run to determine whether this dataset engaged in more overt or subtle RRBs, to confirm or rebut previous research. The researcher intends to run SPSS analyses to determine whether there is a statistical difference between each RRB theme and overall presentation. Secondly, each participant will be analyzed for presentation of RRBs, age, and previous diagnoses. Conclusion: The present study aimed to assist in diagnostic clarity. This was achieved by collecting data from a female-only participant pool across the lifespan. The current data aided in clarifying the types of RRBs engaged in. A limited sample size was a barrier in this study.

Keywords: autism spectrum disorder, camouflaging, rigid repetitive behaviors, gender disparity

Procedia PDF Downloads 125
101 Qualitative Characterization of Proteins in Common and Quality Protein Maize Corn by Mass Spectrometry

Authors: Benito Minjarez, Jesse Haramati, Yury Rodriguez-Yanez, Florencio Recendiz-Hurtado, Juan-Pedro Luna-Arias, Salvador Mena-Munguia

Abstract:

During the last decades, the world has experienced rapid industrialization and an expanding economy favoring a demographic boom. As a consequence, countries around the world have focused on developing new strategies for the production of different farm products in order to meet future demands, seeking to improve the major food products for both humans and livestock. Corn, after wheat and rice, is the third most important crop globally and is the primary food source for both humans and livestock in many regions around the globe. In addition, maize (Zea mays) is an important source of protein, accounting for up to 60% of the daily human protein supply. Generally, many cereal grains have proteins with relatively low nutritional value compared with proteins from meat. In the case of corn, much of the protein is found in the endosperm (75 to 85%) and is deficient in two essential amino acids, lysine and tryptophan. This deficiency results in an imbalance of amino acids and low protein content; normal maize varieties have less than half of the recommended amino acids for human nutrition. In addition, studies have shown that this deficiency is associated with symptoms of growth impairment, anemia, hypoproteinemia, and fatty liver. Because most presently available maize varieties do not contain the quality and quantity of protein necessary for a balanced diet, different countries have focused on research into quality protein maize (QPM). Researchers have characterized QPM, noting that these varieties may contain 70 to 100% more of the amino acid residues essential for animal and human nutrition, lysine and tryptophan, than common corn. Several countries in Africa and Latin America, as well as China, have incorporated QPM into their agricultural development plans. Many of these countries have chosen specific QPM varieties based on their local needs and climate. Reviews have described maize breeding methods and have revealed a lack of studies on the genetic and proteomic diversity of proteins in QPM varieties and their genetic relationships with normal maize varieties. Therefore, molecular marker identification using tools such as mass spectrometry may accelerate the selection of plants that carry the desired proteins with high lysine and tryptophan concentrations. To date, QPM lines have played a very important role in alleviating malnutrition, and better characterization of these lines would provide a valuable nutritional enhancement for use in the resource-poor regions of the world. Thus, the objective of this study was to identify proteins in QPM maize in comparison with a common maize line as a control.

Keywords: corn, mass spectrometry, QPM, tryptophan

Procedia PDF Downloads 274
100 Uncertainty Quantification of Fuel Compositions on Premixed Bio-Syngas Combustion at High-Pressure

Authors: Kai Zhang, Xi Jiang

Abstract:

The effect of fuel variability on the premixed combustion of bio-syngas mixtures is of great importance in bio-syngas utilisation. Uncertainties in the concentrations of fuel constituents such as H2, CO, and CH4 may lead to unpredictable combustion performance, combustion instabilities, and hot spots which may deteriorate and damage the combustion hardware. Numerical modelling and simulation can assist in understanding the behaviour of bio-syngas combustion with pre-defined species concentrations, but the evaluation of concentration variability is expensive. To be more specific, questions such as 'what is the burning velocity of bio-syngas at a specific equivalence ratio?' have been answered either experimentally or numerically, while questions such as 'what is the likely burning velocity when the precise concentrations of the bio-syngas constituents are unknown but their ranges are prescribed?' have not yet been answered. Uncertainty quantification (UQ) methods can be used to tackle such questions and assess the effects of fuel composition. An efficient probabilistic UQ method based on polynomial chaos expansion (PCE) techniques is employed in this study. The method relies on representing the random variables (combustion performance metrics) with orthogonal polynomials such as Legendre or Gaussian polynomials. The PCE constructed via Galerkin projection provides easy access to global sensitivities such as main, joint, and total Sobol indices. In this study, the impact of fuel composition on the combustion (adiabatic flame temperature and laminar flame speed) of bio-syngas fuel mixtures is presented by invoking this PCE technique at several equivalence ratios. High-pressure effects on bio-syngas combustion instability are obtained using a detailed chemical mechanism, the San Diego mechanism. Guidance on reducing combustion instability arising from the upstream biomass gasification process is provided by quantifying the contributions of composition variations to the variance of the physicochemical properties of bio-syngas combustion. It was found that flame speed is very sensitive to the hydrogen variability in bio-syngas, and reducing hydrogen uncertainty from upstream biomass gasification processes can greatly reduce bio-syngas combustion instability. Variation in methane concentration, although thought to be important, has limited impact on laminar flame instabilities, especially for lean combustion. Further studies on the UQ of the hydrogen percentage in bio-syngas can be conducted to guide the safer use of bio-syngas.
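
To make the PCE-to-Sobol step concrete, the sketch below fits a total-degree-2 Legendre expansion to a toy two-input surrogate (stand-ins for scaled H2 and CH4 fractions) by regression and reads the main Sobol indices off the coefficients. The surrogate model and numbers are invented for illustration and are not results from the study.

```python
import numpy as np
from numpy.polynomial import legendre as L

rng = np.random.default_rng(0)

def surrogate(x1, x2):
    """Toy 'flame speed' model of two inputs scaled to [-1, 1]."""
    return 35.0 + 12.0 * x1 + 2.0 * x2 + 3.0 * x1 * x2

# Total-degree-2 Legendre basis in two variables, as (degree_x1, degree_x2) pairs
multi_idx = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]

def basis_matrix(x1, x2):
    cols = []
    for i, j in multi_idx:
        ci = np.zeros(i + 1); ci[i] = 1.0
        cj = np.zeros(j + 1); cj[j] = 1.0
        cols.append(L.legval(x1, ci) * L.legval(x2, cj))
    return np.column_stack(cols)

# Non-intrusive PCE: least-squares regression on random samples
x1, x2 = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)
coeffs, *_ = np.linalg.lstsq(basis_matrix(x1, x2), surrogate(x1, x2), rcond=None)

# Sobol indices from the coefficients; the norm of P_n under U(-1, 1) is 1/(2n+1)
norm = lambda i, j: 1.0 / ((2 * i + 1) * (2 * j + 1))
var_terms = [coeffs[k] ** 2 * norm(*multi_idx[k]) for k in range(1, len(multi_idx))]
total_var = sum(var_terms)
s1 = sum(v for v, (i, j) in zip(var_terms, multi_idx[1:]) if j == 0) / total_var
s2 = sum(v for v, (i, j) in zip(var_terms, multi_idx[1:]) if i == 0) / total_var
print(f"main Sobol indices: S_1 = {s1:.2f}, S_2 = {s2:.2f}")
```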

Keywords: bio-syngas combustion, clean energy utilisation, fuel variability, PCE, targeted uncertainty reduction, uncertainty quantification

Procedia PDF Downloads 266
99 Laser-Dicing Modeling: Implementation of a High Accuracy Tool for Laser-Grooving and Cutting Application

Authors: Jeff Moussodji, Dominique Drouin

Abstract:

The highly complex technology requirements of today’s integrated circuits (ICs) lead to the increased use of several material types, such as metal structures and brittle, porous low-k materials, which are used in both front end of line (FEOL) and back end of line (BEOL) processes for wafer manufacturing. In order to singulate chips from the wafer, a critical laser-grooving process, prior to blade dicing, is used to remove these layers of material from the dicing street. The combination of laser-grooving and blade dicing reduces the potential risk of induced mechanical defects, such as micro-cracks and chipping, on the wafer top surface where the circuitry is located. It therefore seems essential to have a fundamental understanding of the physics involved in laser-dicing in order to maximize control of these critical processes and reduce their undesirable effects on process efficiency, quality, and reliability. In this paper, the study was based on the convergence of two approaches, numerical and experimental, which allowed us to investigate the interaction of a nanosecond pulsed laser with BEOL wafer materials. To evaluate this interaction, several laser-grooved samples were compared with finite element modeling, in which three different aspects were considered: phase change, thermo-mechanical behaviour, and optically sensitive parameters. The mathematical model makes it possible to predict the groove profile (depth, width, etc.) of a single pulse or multiple pulses on BEOL wafer material. Moreover, the heat-affected zone and the thermo-mechanical stress can also be predicted as a function of the laser operating parameters (power, frequency, spot size, defocus, speed, etc.). After model validation and calibration, a satisfactory correlation between experimental and modeling results was observed in terms of groove depth, width and heat-affected zone. The study proposed in this work is a first step toward implementing a quick assessment tool for the design and debugging of multiple laser-grooving conditions with limited experiments on hardware in industrial applications. More correlations and validation tests are in progress and will be included in the full paper.
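
As a rough illustration of the kind of thermal estimate such a model refines, the sketch below evaluates the classical one-dimensional surface-temperature rise of a semi-infinite solid under a constant absorbed flux for a single nanosecond pulse. It ignores phase change, optical penetration depth and the multi-layer BEOL stack, and all laser and material parameters are placeholder values, not those of the study.

```python
# Simplified single-pulse surface heating estimate (1-D semi-infinite solid,
# constant absorbed flux, no phase change). All parameter values are placeholders.
import math

def surface_temp_rise(q_abs, t, k, rho, cp):
    """Surface temperature rise [K] at time t [s] for absorbed flux q_abs [W/m^2]."""
    alpha = k / (rho * cp)                              # thermal diffusivity [m^2/s]
    return 2.0 * q_abs / k * math.sqrt(alpha * t / math.pi)

# Illustrative numbers: 10 uJ pulse, 10 ns duration, 10 um spot radius,
# absorbed in a silicon-like material (assumed properties).
pulse_energy = 10e-6                                    # J
pulse_width = 10e-9                                     # s
spot_radius = 10e-6                                     # m
absorptivity = 0.6                                      # absorbed fraction (assumed)
q_abs = absorptivity * pulse_energy / pulse_width / (math.pi * spot_radius**2)

dT = surface_temp_rise(q_abs, pulse_width, k=150.0, rho=2330.0, cp=700.0)
# The result far exceeds the melting point, consistent with ablative grooving,
# although neglecting phase change makes this an overestimate.
print(f"peak surface temperature rise ~ {dT:.0f} K")
```

A finite element model such as the one described in the abstract replaces this closed-form estimate with a multi-layer, multi-pulse solution that also tracks phase change and thermo-mechanical stress.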

Keywords: laser-dicing, nano-second pulsed laser, wafer multi-stack, multiphysics modeling

Procedia PDF Downloads 196
98 Efficacy Testing of a Product in Reducing Facial Hyperpigmentation and Photoaging after a 12-Week Use

Authors: Nalini Kaul, Barrie Drewitt, Elsie Kohoot

Abstract:

Hyperpigmentation is the third most common pigmentary disorder for which dermatologic treatment is sought. It affects all ages, resulting in skin darkening because of melanin accumulation. An uneven skin tone caused by sun exposure (solar lentigos/age spots/sun spots), by skin disruption following acne or rashes (post-inflammatory hyperpigmentation, PIH), or by hormonal changes (melasma) can lead to significant psychosocial impairment. Dyschromia is the result of various alterations in the biochemical processes regulating melanogenesis. Treatments include the daily use of sunscreen together with lightening, brightening, and exfoliating products. Depigmentation is achieved by various depigmenting agents: common examples are hydroquinone, arbutin, azelaic acid, aloesin, mulberry and licorice extracts, kojic acid, niacinamide, ellagic acid, green tea, turmeric, soy, ascorbic acid, and tranexamic acid. These agents affect pigmentation by interfering with mechanisms before, during, and after melanin synthesis. While immediate correction is much sought after, patience and diligence are key. Our objective was to assess the effects of a facial product combining pigmentation treatment and UV protection in 35 healthy female subjects (35-65 years) meeting the study criteria. Subjects with mild to moderate hyperpigmentation and fine lines, no use of skin-lightening products in the six months before the study, and no dermatological procedures in the twelve months before the study were included. Efficacy parameters included expert clinical grading for hyperpigmentation, radiance, skin tone and smoothness, and fine lines and wrinkles; bioinstrumentation (Corneometer®, Colorimeter®); digital photography and imaging (Visia-CR®); and self-assessment questionnaires. Safety assessments included grading for erythema, edema, dryness and peeling, and self-assessments for itching, stinging, tingling, and burning. Our results showed statistically significant improvement in clinical grading scores, bioinstrumentation, and digital photos for hyperpigmentation (brown spots), fine lines/wrinkles, skin tone, radiance, pores, skin smoothness, and overall appearance compared to baseline. The product was also well tolerated and liked by subjects. Conclusion: Facial hyperpigmentation is of great concern, and treatment strategies are increasingly sought. Clinical trials with both subjective and objective assessments, imaging analyses, and self-perception are essential to distinguish evidence-based products. The multifunctional cosmetic product tested in this clinical study showed efficacy, tolerability, and subject satisfaction in reducing hyperpigmentation and global photoaging.

Keywords: hyperpigmentation, photoaging, clinical testing, expert visual evaluations, bio-instruments

Procedia PDF Downloads 63
97 A Method and System for Secure Authentication Using One Time QR Code

Authors: Divyans Mahansaria

Abstract:

User authentication is an important security measure for protecting confidential data and systems. However, the vulnerability of systems during authentication has significantly increased. Thus, necessary mechanisms must be deployed during the process of authenticating a user to safeguard him/her from such attacks. The proposed solution implements a novel authentication mechanism to counter various forms of security breach attacks, including phishing, Trojan horse, replay, key logging, Asterisk logging, shoulder surfing, brute force search and others. A QR code (Quick Response Code) is a type of matrix barcode or two-dimensional barcode that can be used for storing URLs, text, images and other information. In the proposed solution, during each new authentication request, a QR code is dynamically generated and presented to the user. A piece of generic information is mapped to a plurality of elements and stored within the QR code. The mapping of the generic information to the plurality of elements is randomized at each new login, and thus the QR code generated for each new authentication request is for one-time use only. In order to authenticate into the system, the user needs to decode the QR code using any QR code decoding software. The QR code decoding software needs to be installed on a handheld mobile device such as a smartphone or personal digital assistant (PDA). On decoding the QR code, the user is presented with a mapping between the generic piece of information and the plurality of elements, from which the user derives the cipher secret information corresponding to his/her actual password. Then, in place of the actual password, the user uses this cipher secret information to authenticate into the system. The authentication terminal receives the cipher secret information and uses a validation engine to decipher it. If the entered secret information is correct, the user is granted access to the system. A usability study has been carried out on the proposed solution, and the new authentication mechanism was found to be easy to learn and adopt. A mathematical analysis of the time taken to carry out a brute force attack on the proposed solution has also been carried out; the result showed that the solution is almost completely resistant to brute force attacks. Today’s standard methods for authentication are subject to a wide variety of software, hardware, and human attacks. The proposed scheme can be very useful in controlling the various types of authentication-related attacks, especially in a networked computer environment where the use of a username and password for authentication is common.
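
A minimal sketch of the general idea is shown below, assuming the Python qrcode package. The mapping scheme, field names and payload format are illustrative assumptions for the sketch, not the exact scheme proposed by the author.

```python
# Illustrative one-time QR challenge: map a generic alphabet to a fresh random
# permutation for each login and embed the mapping in a QR code.
# Assumes the "qrcode" package (pip install qrcode[pil]); scheme is a sketch only.
import json
import secrets
import string
import qrcode

def make_one_time_challenge():
    alphabet = list(string.ascii_lowercase + string.digits)
    shuffled = alphabet[:]
    secrets.SystemRandom().shuffle(shuffled)           # fresh random mapping per login
    mapping = dict(zip(alphabet, shuffled))
    challenge_id = secrets.token_hex(8)                # one-time challenge identifier
    payload = json.dumps({"challenge": challenge_id, "map": mapping})
    qrcode.make(payload).save(f"challenge_{challenge_id}.png")
    return challenge_id, mapping

def cipher_password(password, mapping):
    """What the user derives after decoding the QR: the password mapped symbol by symbol."""
    return "".join(mapping.get(ch, ch) for ch in password)

def verify(cipher_text, stored_password, mapping):
    """Server side: check the submitted cipher text against the issued one-time mapping."""
    return secrets.compare_digest(cipher_text, cipher_password(stored_password, mapping))
```

Because a fresh mapping is generated for every request, a captured cipher text is useless for replay once the corresponding challenge has been consumed.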

Keywords: authentication, QR code, cipher / decipher text, one time password, secret information

Procedia PDF Downloads 260
96 Mental Health on Three Continents: A Comparison of Mental Health Disorders in the USA, India and Brazil

Authors: Henry Venter, Murali Thyloth, Alceu Casseb

Abstract:

Historically, mental and substance use disorders were not a global health priority. Since the 1993 World Development Report, the contribution of mental health and substance abuse to the relative global burden of disease morbidity has been recognized, with 300 million people worldwide suffering from depression alone. This led to an international effort to improve the mental health of populations around the world. Despite these efforts, some countries remain at the top of the list of countries with the highest rates of mental illness. Important research questions were asked: Would there be commonalities regarding mental health between these countries; would there be common factors leading to the high prevalence of mental illness; and how prepared are these countries to deliver mental health care? Findings from this research can help organizations and institutions that prepare mental health service providers to focus training and preparation on the specific needs revealed by the study. Methods: The researchers decided to compare three distinctly different countries at the top of the list of countries with the highest rates of mental illness, the USA, India and Brazil, situated on three different continents with different economies and lifestyles. Data were collected using archival research methodology, reviewing records and findings of international and national health and mental health studies to extract and compare data and findings. Results: The findings indicated that India is the most depressed country in the world, followed by the USA (and China), with Brazil having the greatest number of depressed individuals in Latin America. By 2020, roughly 20% of India, a country of over one billion citizens, will suffer from some form of mental illness, yet there are fewer than 4,000 experts available. In the USA, 164.8 million people were substance abusers, and an estimated 47.6 million adults aged 18 or older had a mental illness in 2018. That means that about one in five adults in the USA experiences some form of mental illness each year, but only 41% of those affected received mental health care or services in the past year. Brazil has the greatest number of depressed individuals in Latin America. Adults living in the São Paulo megacity have a higher prevalence of mental disorders than reported in similar surveys conducted in other areas of the world, with more than one million adults at serious impairment levels. Discussion: The results show that, despite the vast socioeconomic differences between the three countries, there are commonalities in mental health prevalence and in the difficulty of providing adequate services, including a lack of awareness of how serious mental illness is, stigma around seeking treatment, comorbidity as a common phenomenon, and a lack of partnership between different levels of service providers, which weakens mental health service delivery. The findings also indicate that mental health training institutions have a monumental task in preparing personnel to address the future mental health needs of each of the countries compared, which will constitute the next phase of the research.

Keywords: mental health epidemiology, mental health disorder, mental health prevalence, mental health treatment

Procedia PDF Downloads 101
95 Assessment of Influence of Short-Lasting Whole-Body Vibration on Joint Position Sense and Body Balance–A Randomised Masked Study

Authors: Anna Slupik, Anna Mosiolek, Sebastian Wojtowicz, Dariusz Bialoszewski

Abstract:

Introduction: Whole-body vibration (WBV) uses high-frequency mechanical stimuli generated by a vibration plate and transmitted through bone, muscle and connective tissues to the whole body. Research has shown that long-term vibration-plate training improves neuromuscular facilitation, especially in the afferent neural pathways responsible for the conduction of vibration and proprioceptive stimuli, as well as muscle function, balance and proprioception. Some researchers suggest that the vibration stimulus briefly inhibits the conduction of afferent signals from proprioceptors and can interfere with the maintenance of body balance. The aim of this study was to evaluate the influence of a single set of exercises associated with whole-body vibration on joint position sense and body balance. Material and methods: The study enrolled 55 people aged 19-24 years. These individuals were randomly divided into a test group (30 persons) and a control group (25 persons). Both groups performed the same set of exercises on a vibration plate. The test group used the following vibration parameters: a frequency of 20 Hz and an amplitude of 3 mm. The control group performed the exercises on the vibration plate while it was off. All participants were instructed to perform six dynamic exercises lasting 30 seconds each, with a 60-second period of rest between them. The exercises involved large muscle groups of the trunk, pelvis and lower limbs. Measurements were carried out before and immediately after exercise. Joint position sense (JPS) was measured in the knee joint for the starting position at 45° in an open kinematic chain. JPS error was measured using a digital inclinometer. Balance was assessed in a standing position with both feet on the ground, with the eyes open and closed (each test lasting 30 seconds), using the Matscan system with FootMat 7.0 SAM software. The area of the confidence ellipse and the anteroposterior and mediolateral sway were measured to assess balance. Statistical analysis was performed using Statistica 10.0 PL software. Results: There were no significant differences between the groups, either before or after the exercise (p > 0.05). JPS did not change significantly in either the test group (10.7° vs. 8.4°) or the control group (9.0° vs. 8.4°). No significant differences were shown in any of the test parameters during the balance tests with the eyes open or closed in either the test or the control group (p > 0.05). Conclusions: 1. Deterioration in proprioception or balance was not observed immediately after the vibration stimulus. This suggests that vibration-induced blockage of proprioceptive stimuli conduction can have only a short-lasting effect that occurs only as long as a vibration stimulus is present. 2. Short-term use of vibration in treatment does not impair proprioception and seems to be safe for patients with proprioceptive impairment. 3. These results need to be supplemented with an assessment of proprioception during the application of vibration stimuli. Additionally, the impact of the vibration parameters used in the exercises should be evaluated.
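
To illustrate the kind of between-group comparison reported above (the study itself used Statistica), the sketch below runs an independent-samples test on synthetic post-exercise JPS-error data; the values are invented for the example and are not the study's measurements.

```python
# Illustrative between-group comparison of post-exercise JPS error (degrees).
# The data below are synthetic placeholders, not the study's measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
jps_test = rng.normal(8.4, 3.0, 30)      # vibration group, n = 30
jps_control = rng.normal(8.4, 3.0, 25)   # sham (plate off) group, n = 25

t_stat, p_value = stats.ttest_ind(jps_test, jps_control, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")  # p > 0.05 -> no group difference
```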

Keywords: balance, joint position sense, proprioception, whole body vibration

Procedia PDF Downloads 315
94 Biomechanical Modeling, Simulation, and Comparison of Human Arm Motion to Mitigate Astronaut Task during Extra Vehicular Activity

Authors: B. Vadiraj, S. N. Omkar, B. Kapil Bharadwaj, Yash Vardhan Gupta

Abstract:

During manned exploration of space, missions will require astronaut crewmembers to perform Extra Vehicular Activities (EVAs) for a variety of tasks. These EVAs take place after long periods of operations in space, and in and around unique vehicles, space structures and systems. Considering the remoteness and time spans in which these vehicles will operate, EVA system operations should utilize common worksites, tools and procedures as much as possible to increase the efficiency of training and proficiency in operations. All of the preparations need to be carried out based on studies of astronaut motions. Until now, development and training activities associated with the planned EVAs in the Russian and U.S. space programs have relied almost exclusively on physical simulators. These experimental tests are expensive and time-consuming. During the past few years, a strong increase has been observed in the use of computer simulations due to fast developments in computer hardware and simulation software. Based on this idea, an effort to develop a computational simulation system to model human dynamic motion for EVA was initiated. This study focuses on the simulation of an astronaut moving orbital replaceable units into worksites or removing them from worksites. Our physics-based methodology helps fill the gap in the quantitative analysis of astronaut EVA by providing a multisegment human arm model. The simulation work described in this study improves on the realism of previous efforts, incorporating joint stops to account for the physiological limits of the range of motion. To demonstrate the utility of this approach, the human arm model is simulated virtually using ADAMS/LifeMOD® software. The kinematics of the astronaut's task is studied through joint angles and torques. The simulation results obtained are validated against a numerical simulation based on the principles of the Newton-Euler method. Torques determined using the mathematical model are compared among the subjects to assess the smoothness and consistency of the task performed. We conclude that, due to the uncertain nature of exploration-class EVA, a virtual model developed using a multibody dynamics approach offers significant advantages over traditional human modeling approaches.
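
A minimal sketch of the kind of Newton-Euler inverse dynamics used for such validation is shown below for a planar two-link (shoulder-elbow) arm. The segment parameters and joint state are placeholder values, not the subjects' anthropometric data, and gravity is set to zero to approximate the orbital EVA case.

```python
# Inverse dynamics of a planar two-link arm: tau = M(q) qdd + C(q, qd) qd + G(q).
# Segment masses, lengths, inertias and the joint state are illustrative placeholders.
import numpy as np

def arm_torques(q, qd, qdd, m1=2.0, m2=1.5, l1=0.30, l2=0.28,
                lc1=0.15, lc2=0.14, I1=0.02, I2=0.015, g=0.0):
    """Joint torques [N m] for joint angles q, velocities qd, accelerations qdd.
    g = 0 approximates microgravity EVA; use g = 9.81 for ground-based comparison."""
    c2, s2 = np.cos(q[1]), np.sin(q[1])
    # Mass (inertia) matrix
    M11 = I1 + I2 + m1 * lc1**2 + m2 * (l1**2 + lc2**2 + 2 * l1 * lc2 * c2)
    M12 = I2 + m2 * (lc2**2 + l1 * lc2 * c2)
    M22 = I2 + m2 * lc2**2
    M = np.array([[M11, M12], [M12, M22]])
    # Coriolis / centrifugal terms
    h = m2 * l1 * lc2 * s2
    C = np.array([[-h * qd[1], -h * (qd[0] + qd[1])],
                  [ h * qd[0], 0.0]])
    # Gravity terms
    G = np.array([(m1 * lc1 + m2 * l1) * g * np.cos(q[0]) + m2 * lc2 * g * np.cos(q[0] + q[1]),
                  m2 * lc2 * g * np.cos(q[0] + q[1])])
    return M @ qdd + C @ qd + G

# Example: shoulder and elbow torques at one instant of a reaching motion.
q = np.array([0.4, 0.8])       # joint angles [rad]
qd = np.array([0.5, -0.3])     # joint velocities [rad/s]
qdd = np.array([1.0, 0.5])     # joint accelerations [rad/s^2]
print(arm_torques(q, qd, qdd))
```

Evaluating these torques along a measured joint-angle trajectory gives the same quantities that the ADAMS/LifeMOD simulation reports, which is what makes the cross-check between the two approaches possible.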

Keywords: extra vehicular activity, biomechanics, inverse kinematics, human body modeling

Procedia PDF Downloads 327
93 The Changing Role of Technology-Enhanced University Library Reform in Improving College Student Learning Experience and Career Readiness – A Qualitative Comparative Analysis (QCA)

Authors: Xiaohong Li, Wenfan Yan

Abstract:

Background: While it is widely considered that the university library plays a critical role in fulfilling the institution's mission and providing students' learning experiences beyond the classroom, how technology-enhanced library reform has changed college students' learning experience has not been thoroughly investigated. The purpose of this study is to explore how technology-enhanced library reform affects students' learning experience and career readiness, and further to identify the factors and effective conditions that enable quality learning outcomes for Chinese college students. Methodologies: This study selected the qualitative comparative analysis (QCA) method to explore the effects of technology-enhanced university library reform on college students' learning experience and career readiness. QCA is unique in explaining the complex relationships among multiple factors from a holistic perspective. Compared with traditional quantitative and qualitative analysis, QCA not only adds some quantitative logic but also inherits the characteristics of qualitative research, focusing on the heterogeneity and complexity of samples. Shenyang Normal University (SNU) was selected as a sample of a typical comprehensive university in China that focuses on students' learning and application of professional knowledge and trains professionals to different levels of expertise. A total of 22 current university students and 30 graduates who joined the Library Readers Association of SNU from 2011 to 2019 were selected for semi-structured interviews. Based on the data collected from these participating students, qualitative comparative analysis, including univariate necessity analysis and multi-configuration analysis, was conducted. Findings and Discussion: The QCA results indicated that the influence of the technology-enhanced university library's restructuring and reorganization on student learning experience and career readiness is the result of multiple factors. Technology-enhanced library equipment and other hardware were restructured to meet college students' learning needs and have played an important role in improving the student learning experience and learning persistence. More importantly, the soft characteristics of technology-enhanced library reform, such as the library's service innovation space and culture space, have a positive impact on students' career readiness and development. Technology-enhanced university library reform is not only a change in the building's appearance and facilities but also in library service quality and capability. The study also provides suggestions for policy, practice, and future research.
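
To illustrate the core QCA computation behind such an analysis (consistency and coverage of a condition, or a configuration of conditions, for the outcome), the sketch below applies the standard fuzzy-set formulas to invented cases; the condition names and membership scores are hypothetical, not the study's calibrated data.

```python
# Illustrative fuzzy-set QCA consistency/coverage (standard Ragin-style formulas).
# Membership scores below are invented for the example, not the study's data.
import numpy as np

def sufficiency_consistency(x, y):
    """Consistency of X as sufficient for Y: sum(min(X, Y)) / sum(X)."""
    return np.minimum(x, y).sum() / x.sum()

def sufficiency_coverage(x, y):
    """Coverage of Y by X: sum(min(X, Y)) / sum(Y)."""
    return np.minimum(x, y).sum() / y.sum()

# Hypothetical calibrated memberships for 8 cases.
hardware_upgrade = np.array([0.9, 0.8, 0.6, 0.7, 0.4, 0.9, 0.3, 0.8])
service_innovation = np.array([0.8, 0.9, 0.7, 0.6, 0.3, 0.9, 0.2, 0.7])
learning_outcome = np.array([0.9, 0.8, 0.7, 0.7, 0.4, 0.9, 0.3, 0.8])

# Single conditions and a joint configuration (logical AND = fuzzy minimum).
combo = np.minimum(hardware_upgrade, service_innovation)
for name, cond in [("hardware", hardware_upgrade),
                   ("service", service_innovation),
                   ("hardware AND service", combo)]:
    print(name,
          "consistency:", round(sufficiency_consistency(cond, learning_outcome), 3),
          "coverage:", round(sufficiency_coverage(cond, learning_outcome), 3))
```

In a full analysis, configurations whose consistency exceeds a chosen threshold are retained and then minimized into the solution terms reported in the findings.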

Keywords: career readiness, college student learning experience, qualitative comparative analysis (QCA), technology-enhanced library reform

Procedia PDF Downloads 69
92 Good Functional Outcome after Late Surgical Treatment for Traumatic Rotator Cuff Tear, a Retrospective Cohort Study

Authors: Soheila Zhaeentan, Anders Von Heijne, Elisabet Hagert, André Stark, Björn Salomonsson

Abstract:

The recommended treatment for traumatic rotator cuff tear (TRCT) is surgery within a few weeks after injury if the diagnosis is made early, especially if a functional impairment of the shoulder exists. This may lead to the assumption that a poor outcome can be expected from delayed surgical treatment, when the patient is diagnosed at a later stage. The aim of this study was to investigate whether a surgical repair later than three months after injury may result in successful outcomes and patient satisfaction. There is evidence in the literature that good results of treatment can be expected up to three months after the injury, but little is known of later treatment with cuff repair. 73 patients (75 shoulders), 58 males/17 females, mean age 59 (range 34-72), who had undergone surgical intervention for TRCT between January 1999 and December 2011 at our clinic, were included in this study. Patients were assessed by MRI investigation, clinical examination, the Western Ontario Rotator Cuff index (WORC), the Oxford Shoulder Score, the Constant-Murley Score, EQ-5D and patient subjective satisfaction at follow-up. The patients treated surgically within three months (< 12 weeks) after injury (39 cases) were compared with patients treated more than three months (≥ 12 weeks) after injury (36 cases). WORC was used as the primary outcome measure and the other variables as secondary. A senior consultant radiologist, blinded to patient category and clinical outcome, evaluated all MRI images. Rotator cuff integrity, presence of arthritis, fatty degeneration and muscle atrophy were evaluated in all cases. The average follow-up time was 56 months (range 14-149) and the average time from injury to repair was 16 weeks (range 3-104). No statistically significant differences were found for any of the assessed parameters or scores between the two groups. The mean WORC score was 77 for both groups (early group, range 25-100; late group, range 27-100; p = 0.86), and the groups were similar for the Constant-Murley Score (p = 0.91), Oxford Shoulder Score (p = 0.79) and EQ-5D index (p = 0.86). The re-tear frequency was 24% for both groups, and the patients with re-tears reported less satisfaction with the outcome. Discussion and conclusion: This study shows that surgical repair of TRCT performed later than three months after injury may result in good functional outcomes and patient satisfaction. However, this does not motivate an intentional delay in surgery when there is an indication for surgical repair, as such a delay may adversely affect the possibility of performing a repair. Our results show that surgeons may safely consider surgical repair even if a delay in diagnosis has occurred. In summary, this retrospective cohort study of 75 shoulders shows good functional results after traumatic rotator cuff tear treated surgically up to one year after the injury.

Keywords: traumatic rotator cuff injury, time to surgery, surgical outcome, retrospective cohort study

Procedia PDF Downloads 212
91 Effects of Virtual Reality Treadmill Training on Gait and Balance Performance of Patients with Stroke: Review

Authors: Hanan Algarni

Abstract:

Background: Impairment of walking and balance skills has a negative impact on functional independence and community participation after stroke. Gait recovery is considered a primary goal in rehabilitation by both patients and physiotherapists. Treadmill training coupled with virtual reality technology is a newly emerging approach that offers patients feedback and open, random skills practice while walking and interacting with virtual environmental scenes. Objectives: To synthesize the evidence on the effects of VR treadmill training on gait speed and balance primarily, and on functional independence and community participation secondarily, in stroke patients. Methods: A systematic review was conducted; the search strategy included the electronic databases MEDLINE, AMED, Cochrane, CINAHL, EMBASE, PEDro, Web of Science, and unpublished literature. Inclusion criteria: Participants: adults >18 years, stroke, ambulatory, without severe visual or cognitive impairments. Intervention: VR treadmill training alone or with physiotherapy. Comparator: any other interventions. Outcomes: gait speed, balance, function, community participation. Characteristics of the included studies were extracted for analysis. Risk of bias assessment was performed using Cochrane's ROB tool. A narrative synthesis of the findings was undertaken, and a summary of findings for each outcome was reported using GRADEpro. Results: Four studies were included, involving 84 stroke participants with chronic hemiparesis. Intervention intensity ranged from 6 to 12 sessions of 20 minutes to 1 hour per session. Three studies investigated the effects on gait speed and balance, two studies investigated functional outcomes, and one study assessed community participation. The ROB assessment showed 50% unclear risk of selection bias and 25% unclear risk of detection bias across the studies. Heterogeneity was identified in the intervention effects at post-training and follow-up. Outcome measures, training intensity and durations also varied across the studies; the grade of evidence was low for balance, moderate for speed and function outcomes, and high for community participation. However, it is important to note that grading was based on a small number of studies for each outcome. Conclusions: The summary of findings suggests positive and statistically significant effects (p < 0.05) of VR treadmill training compared to other interventions on gait speed, dynamic balance skills, function and participation directly after training. However, the effects were not sustained at follow-up in two studies (2 weeks-1 month), and the other studies did not perform follow-up measurements. More RCTs with larger sample sizes and higher methodological quality are required to examine the long-term effects of VR treadmill training on functional independence and community participation after stroke, in order to draw conclusions and produce stronger, more robust evidence.

Keywords: virtual reality, treadmill, stroke, gait rehabilitation

Procedia PDF Downloads 266