Search results for: recovery time
15959 Core Number Optimization Based Scheduler to Order/Map Simulink Application
Authors: Asma Rebaya, Imen Amari, Kaouther Gasmi, Salem Hasnaoui
Abstract:
Over the last few years, the number of cores in digital signal and general-purpose processors has increased spectacularly. Concurrently, significant research has been done to benefit from this high degree of parallelism. In particular, this research focuses on providing efficient scheduling of hardware/software systems onto multicore architectures. The scheduling process consists of statically choosing one core to execute each task and specifying an execution order for the application tasks. In this paper, we describe an efficient scheduler that calculates the optimal number of cores required to schedule an application, gives a heuristic scheduling solution, and evaluates its cost. Our results are evaluated and compared with those of the PREESM scheduler, and we show that our approach allows better scheduling in terms of latency, computation time, and number of cores.
Keywords: computation time, hardware/software system, latency, optimization, multi-core platform, scheduling
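As a purely illustrative companion to this abstract, the following Python sketch shows one common form of static list scheduling: each ready task is greedily assigned to the core on which it can finish earliest, and the makespan of the resulting schedule stands in for the latency being minimized. The task names, durations, and dependencies are hypothetical, and this is not the authors' algorithm.

```python
# Hypothetical static list-scheduling sketch: map dependent tasks to cores
# by always choosing the core that lets the task finish earliest.

def schedule(tasks, deps, num_cores):
    """tasks: {name: duration}; deps: {name: [prerequisites]}."""
    core_free = [0.0] * num_cores          # time at which each core becomes available
    finish = {}                            # task -> finish time
    order = []                             # (start, core, task) execution order
    remaining = set(tasks)
    while remaining:
        # pick a task whose prerequisites are already scheduled (shortest first)
        ready = [t for t in remaining if all(d in finish for d in deps.get(t, []))]
        task = min(ready, key=lambda t: tasks[t])
        earliest_dep = max((finish[d] for d in deps.get(task, [])), default=0.0)
        core = min(range(num_cores), key=lambda c: max(core_free[c], earliest_dep))
        start = max(core_free[core], earliest_dep)
        core_free[core] = finish[task] = start + tasks[task]
        order.append((start, core, task))
        remaining.remove(task)
    return order, max(finish.values())     # schedule and makespan (latency)

if __name__ == "__main__":
    tasks = {"A": 2, "B": 3, "C": 1, "D": 4}       # hypothetical application tasks
    deps = {"C": ["A"], "D": ["A", "B"]}
    order, latency = schedule(tasks, deps, num_cores=2)
    print(order, "latency =", latency)
```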
Procedia PDF Downloads 285

15958 Effectiveness of Multi-Business Core Development Policy in Tokyo Metropolitan Area
Authors: Takashi Nakamura
Abstract:
In the Tokyo metropolitan area, traffic congestion and long commute times are caused by overconcentration in the central area. To resolve these problems, a core business city development policy was adopted in 1988. The core business cities, which include Yokohama, Chiba, Saitama, Tachikawa, and others, have designated business facilities accumulation districts where assistance measures are applied. Focusing on Yokohama city, this study investigates the trends in the number of offices, employees, and commuters in the 2001 and 2012 Economic Censuses, as well as the average commute time in the Tokyo metropolitan area from the 2005 to 2015 Metropolitan Transportation Censuses. Surveys were administered as part of the 2001 and 2012 Economic Censuses to participants who worked in Yokohama, and analysed according to their distribution across the city's 1,757 subregions. Four main findings emerged: (1) The number of offices increased in Yokohama while the number of offices decreased in the Tokyo metropolitan area overall. Additionally, the number of employees in Yokohama increased. (2) The number of commuters to Tokyo's central area increased from Saitama prefecture, the Tokyo Tama area, and the Tokyo central area. However, it decreased from other areas. (3) The average commute time in the Tokyo metropolitan area was 67.7 minutes in 2015, a slight decrease from 2005 and 2010. (4) The number of employees at business facilities accumulation districts in Yokohama city increased greatly.
Keywords: core business city development policy, commute time, number of employees, Yokohama city, distribution of employees
Procedia PDF Downloads 148

15957 Comparison of Different DNA Extraction Platforms with FFPE Tissue
Authors: Wang Yanping Karen, Mohd Rafeah Siti, Park MI Kyoung
Abstract:
Formalin-fixed paraffin embedded (FFPE) tissue is important in the area of oncological diagnostics. This method of preserving tissues enables them to be stored easily at ambient temperature for a long time. This decreases the risk of losing DNA quantity and quality after extraction, reduces sample wastage, and makes FFPE more cost-effective. However, extracting DNA from FFPE tissue is a challenge, as the purified DNA is often highly cross-linked, fragmented, and degraded, which in turn causes problems for many downstream processes. In this study, the DNA extraction efficiency of One BioMed’s Xceler8 automated platform is compared with that of commercially available extraction kits (Qiagen and Roche). The FFPE tissue slices were subjected to deparaffinization, pretreatment and then DNA extraction using the three platforms mentioned. The DNA quantity was determined with real-time PCR (BioRad CFX) and gel electrophoresis. The amount of DNA extracted with the One BioMed X8 platform was found to be comparable with that extracted with the other two manual extraction kits.
Keywords: DNA extraction, FFPE tissue, Qiagen, Roche, One BioMed X8
Procedia PDF Downloads 110

15956 Efficiency of Treatment in Patients with Newly Diagnosed Destructive Pulmonary Tuberculosis Using Intravenous Chemotherapy
Authors: M. Kuzhko, M. Gumeniuk, D. Butov, T. Tlustova, O. Denysov, T. Sprynsian
Abstract:
Background: The aim of the research was to determine the effectiveness of chemotherapy using intravenous antituberculosis drugs compared with their oral administration during the intensive phase of treatment. Methods: 152 tuberculosis patients were randomized into 2 groups: the main group (n=65), which received isoniazid, ethambutol and sodium rifamycin intravenously plus pyrazinamide per os, and the control group (n=87), which received all the drugs (isoniazid, rifampicin, ethambutol, pyrazinamide) orally. Results: After 2 weeks of treatment, symptoms of intoxication disappeared in 59 (90.7±3.59 %) patients of the main group and 60 (68.9±4.9 %) patients of the control group, p<0.05. The mean duration of symptoms of intoxication was 9.6±0.7 days in the main group and 13.7±0.9 days in the control group. After completing the intensive phase, sputum conversion was found in all patients of the main group and in 71 (81.6±4.1 %) patients of the control group, p < 0.05. The average time to sputum conversion was 1.6±0.1 months in the main group and 1.9±0.1 months in the control group, p > 0.05. In patients with destructive pulmonary tuberculosis, time to sputum conversion was 1.7±0.1 months in the main group and 2.2±0.2 months in the control group, p < 0.05. The average time of cavity healing was 2.9±0.2 months in the main group and 3.9±0.2 months in the control group, p < 0.05. Conclusions: In patients with newly diagnosed destructive pulmonary tuberculosis, intravenous use of isoniazid, ethambutol and sodium rifamycin in the intensive phase of chemotherapy resulted in a significantly shorter time to disappearance of symptoms of intoxication and to sputum conversion.
Keywords: intravenous chemotherapy, tuberculosis, treatment efficiency, tuberculosis drugs
Procedia PDF Downloads 204

15955 Accurate Calculation of the Penetration Depth of a Bullet Using ANSYS
Authors: Eunsu Jang, Kang Park
Abstract:
In developing an armored ground combat vehicle (AGCV), analyzing the vulnerability (or the survivability) of the AGCV against an enemy’s attack is a very important step. In the vulnerability analysis, penetration equations are usually used to obtain the penetration depth and check whether a bullet can penetrate the armor of the AGCV, which causes damage to internal components or crews. The penetration equations are derived from penetration experiments, which require a long time and great effort. However, they usually hold only for the specific material of the target and the specific type of bullet used in the experiments. Thus, penetration simulation using ANSYS can be another option to calculate penetration depth. However, it is very important to model the targets and select the input parameters correctly in order to get an accurate penetration depth. This paper performed a sensitivity analysis of the ANSYS input parameters with respect to the accuracy of the calculated penetration depth. Two conflicting objectives need to be achieved in adopting ANSYS for penetration analysis: maximizing the accuracy of the calculation and minimizing the calculation time. To maximize the calculation accuracy, a sensitivity analysis of the input parameters for ANSYS was performed and the RMS error with respect to the experimental data was calculated. The input parameters, including mesh size, boundary condition, material properties, and target diameter, were tested and selected to minimize the error between the calculated result from the simulation and the experimental data from papers on the penetration equation. To minimize the calculation time, the parameter values obtained from the accuracy analysis were adjusted to obtain optimized overall performance. As a result of the analysis, the following was found: 1) As the mesh size gradually decreases from 0.9 mm to 0.5 mm, both the penetration depth and the calculation time increase. 2) As the diameter of the target decreases from 250 mm to 60 mm, both the penetration depth and the calculation time decrease. 3) As the yield stress, one of the material properties of the target, decreases, the penetration depth increases. 4) The boundary condition with the fixed side surface of the target gives more penetration depth than that with the fixed side and rear surfaces. Using the above findings, the input parameters can be tuned to minimize the error between simulation and experiments. By using the simulation tool ANSYS with carefully tuned input parameters, penetration analysis can be done on a computer without actual experiments. The data from penetration experiments are usually hard to obtain because of security reasons, and only published papers provide them, for a limited set of target materials. The next step of this research is to generalize this approach to anticipate the penetration depth by interpolating the known penetration experiments. This result may not be accurate enough to replace the penetration experiments, but such simulations can be used in the early stages of the AGCV design process, in the modelling and simulation stage.
Keywords: ANSYS, input parameters, penetration depth, sensitivity analysis
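To make the accuracy criterion concrete, the short Python sketch below shows how simulated penetration depths from a parameter sweep could be scored against experimental values with the RMS error mentioned in the abstract. The mesh sizes and depth values are invented for illustration and are not taken from the paper.

```python
# Hypothetical parameter-sweep scoring: compare simulated penetration depths
# against experimental values using the root-mean-square (RMS) error.
import math

def rms_error(simulated, experimental):
    return math.sqrt(sum((s - e) ** 2 for s, e in zip(simulated, experimental))
                     / len(experimental))

# Made-up experimental depths (mm) and simulated depths for two mesh sizes.
experimental = [12.1, 18.4, 25.0]
runs = {
    0.9: [10.8, 17.1, 23.2],   # coarse mesh (mm): faster, less accurate
    0.5: [11.9, 18.0, 24.6],   # fine mesh: slower, closer to experiment
}

best = min(runs, key=lambda mesh: rms_error(runs[mesh], experimental))
for mesh, sim in runs.items():
    print(f"mesh {mesh} mm -> RMS error {rms_error(sim, experimental):.2f} mm")
print("best mesh size:", best)
```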
Procedia PDF Downloads 404

15954 Thermoelectric Properties of Doped Polycrystalline Silicon Film
Authors: Li Long, Thomas Ortlepp
Abstract:
The transport properties of carriers in polycrystalline silicon film affect the performance of polycrystalline silicon-based devices. They depend strongly on the grain structure, grain boundary trap properties and doping concentration, which in turn are determined by the film deposition and processing conditions. Based on the properties of charge carriers, phonons, grain boundaries and their interactions, the thermoelectric properties of polycrystalline silicon are analyzed with the relaxation time approximation of the Boltzmann transport equation. With this approach, thermal conductivity, electrical conductivity and Seebeck coefficient as a function of grain size, trap properties and doping concentration can be determined. Experiments on heavily doped polycrystalline silicon are carried out, and measurement results are compared with the model.
Keywords: conductivity, polycrystalline silicon, relaxation time approximation, Seebeck coefficient, thermoelectric property
Procedia PDF Downloads 126

15953 Detection of Voltage Sag and Voltage Swell in Power Quality Using Wavelet Transforms
Authors: Nor Asrina Binti Ramlee
Abstract:
Voltage sag, voltage swell, high-frequency noise and voltage transients are kinds of disturbances in power quality. They are also known as power quality events. Equipment used in industry nowadays has become more sensitive to these events as its complexity increases. This makes it important to distribute clean power to the consumer. To provide better service, accurate power quality analysis is vital. Thus, this paper presents event detection focusing on voltage sag and swell. The method is developed by applying time-domain signal analysis using a wavelet transform approach in MATLAB. Four types of mother wavelet, namely Haar, Dmey, Daubechies, and Symlet, are used to detect the events. This project analyzed a real interrupted signal obtained from a 22 kV transmission line in Skudai, Johor Bahru, Malaysia. The signals are decomposed using these mother wavelets. The best mother wavelet is the one that is capable of detecting the time location of the event accurately.
Keywords: power quality, voltage sag, voltage swell, wavelet transform
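A minimal sketch of the approach, assuming the PyWavelets package as a stand-in for the MATLAB wavelet toolbox: a sag is injected into a synthetic 50 Hz waveform, and the level-1 detail coefficients of a Daubechies decomposition spike at the sag boundaries, giving the time location of the event. The sampling rate, sag depth, and threshold are hypothetical.

```python
# Sketch of wavelet-based disturbance localization on a synthetic signal:
# a voltage sag is injected into a 50 Hz waveform and the level-1 detail
# coefficients of a Daubechies DWT spike at the sag boundaries.
import numpy as np
import pywt  # PyWavelets, assumed available

fs = 3200                                    # sampling rate in Hz (hypothetical)
t = np.arange(0, 0.4, 1 / fs)
v = np.sin(2 * np.pi * 50 * t)               # 1 p.u. supply voltage
v[(t >= 0.15) & (t < 0.25)] *= 0.6           # 40 % sag between 150 ms and 250 ms

cA1, cD1 = pywt.wavedec(v, wavelet="db4", level=1, mode="periodization")
detail = np.abs(cD1)
threshold = 10 * np.median(detail)           # crude robust threshold
event_times = np.where(detail > threshold)[0] * 2 / fs
print("sag boundaries detected near (s):", event_times.round(3))
```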
Procedia PDF Downloads 375

15952 A Portable Device for Pulse Wave Velocity Measurements
Authors: Chien-Lin Wang, Cha-Ling Ko, Tainsong Chen
Abstract:
Pulse wave velocity (PWV) of blood flow provides important information about vessel properties and blood pressure, which can be used to assess cardiovascular disease. However, such measurements need expensive equipment, such as Doppler ultrasound, MRI, angiography, etc. Photoplethysmograph (PPG) signals are commonly utilized to detect blood volume changes. In this study, two infrared (IR) probes are designed and placed a fixed distance apart, at the finger base and fingertip. An analog circuit with automatic gain adjustment is implemented to obtain stable original PPG signals from the two IR probes. In order to obtain the time delay between the two PPG signals precisely, the pulse transit time is derived from the second derivative of the original PPG signals. To obtain a portable, wireless and low-power-consumption PWV measurement device, Bluetooth Low Energy 4.0 (BLE) and a microprocessor (Cortex™-M3) are used in this study. The PWV is highly correlated with blood pressure. This portable device has the potential to be used for continuous blood pressure monitoring.
Keywords: pulse wave velocity, photoplethysmography, portable device, biomedical engineering
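The sketch below illustrates the described computation in Python on synthetic waveforms: a fiducial point is taken from the second derivative of each PPG signal, the pulse transit time is the delay between the two fiducial points, and PWV is the probe separation divided by that delay. The pulse shape, sampling rate, and probe distance are assumptions, not values from the paper.

```python
# Sketch: estimate pulse transit time (PTT) from two synthetic PPG pulses by
# locating the peak of the second derivative of each, then PWV = distance / PTT.
import numpy as np

fs = 1000.0                                  # sampling rate in Hz (hypothetical)
t = np.arange(0, 1.0, 1 / fs)

def ppg_pulse(t, onset):
    """Very rough single PPG pulse shape starting at `onset` seconds."""
    x = np.clip(t - onset, 0, None)
    return x * np.exp(-x / 0.15)

base = ppg_pulse(t, onset=0.200)             # probe at the finger base
tip = ppg_pulse(t, onset=0.235)              # probe at the fingertip, 35 ms later

def fiducial_index(sig):
    d2 = np.gradient(np.gradient(sig))       # second derivative (acceleration PPG)
    return int(np.argmax(d2))                # steepest upstroke as fiducial point

ptt = (fiducial_index(tip) - fiducial_index(base)) / fs
distance = 0.12                              # probe separation in metres (hypothetical)
print(f"PTT = {ptt * 1000:.1f} ms, PWV = {distance / ptt:.2f} m/s")
```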
Procedia PDF Downloads 529

15951 Analysing Environmental Licensing of Infrastructure Projects in Brazil
Authors: Ronaldo Seroa Da Motta, Gabriela Santiago
Abstract:
The main contribution of this study is the identification of the factors influencing the environmental licensing process of infrastructure projects in Brazil. These factors reflect the technical characteristics of the project, the corporate governance of the entrepreneur, and the institutional and regulatory governance of the environmental agency, including the number of interventions by non-licensing agencies. The model conditions the licensing processing time of 34 infrastructure projects on these variables. Our results indicated that licensing time is most sensitive to the type of enterprise (complex projects such as gas pipelines and hydroelectric plants in the most vulnerable biome), to higher values of the enterprise or of the entrepreneur's assets, and to the number of employees of the licensing agency. The number of external interventions by other non-licensing institutions does not affect the licensing time. Such results challenge the frequent criticism that environmental licensing is a barrier to speeding up investments in infrastructure projects in Brazil because of the participation of civil society and other non-licensing institutions.
Keywords: environmental licensing, conditionants, Brazil, timing process
Procedia PDF Downloads 137

15950 Performance Analysis of the Time-Based and Periodogram-Based Energy Detector for Spectrum Sensing
Authors: Sadaf Nawaz, Adnan Ahmed Khan, Asad Mahmood, Chaudhary Farrukh Javed
Abstract:
Classically, an energy detector is implemented in the time domain (TD). However, a frequency domain (FD) based energy detector has demonstrated improved performance. This paper presents a comparison between the two approaches to analyze their pros and cons. A detailed performance analysis of the classical TD energy detector and the periodogram-based detector is performed. Exact and approximate mathematical expressions for the probability of false alarm (Pf) and the probability of detection (Pd) are derived for both approaches. The derived expressions naturally lead to an analytical as well as intuitive explanation of the improved performance in terms of Pf and Pd in different scenarios. Our analysis suggests that the improvement depends on the buffer size: Pf is improved in FD-based energy detectors, whereas Pd is enhanced in TD-based ones. Finally, Monte Carlo simulation results confirm the analysis based on the derived expressions.
Keywords: cognitive radio, energy detector, periodogram, spectrum sensing
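As a hedged illustration of the time-domain case only, the Monte Carlo sketch below estimates Pf and Pd for an energy detector whose test statistic is the average energy over a buffer of N samples; the buffer size, SNR, and target false-alarm rate are hypothetical, and the closed-form expressions derived in the paper are not reproduced here.

```python
# Monte Carlo sketch of a time-domain energy detector: the test statistic is
# the average signal energy over a buffer, compared against a threshold set
# for a target false-alarm rate; Pf and Pd are then estimated empirically.
import numpy as np

rng = np.random.default_rng(0)
N = 128                                       # buffer size (hypothetical)
snr_db = -2.0                                 # signal-to-noise ratio (hypothetical)
trials = 20000

noise_only = rng.normal(0, 1, (trials, N))
amp = np.sqrt(2 * 10 ** (snr_db / 10))        # sinusoid amplitude for this SNR
n = np.arange(N)
signal = amp * np.sin(2 * np.pi * 0.1 * n)
signal_plus_noise = signal + rng.normal(0, 1, (trials, N))

energy_h0 = (noise_only ** 2).mean(axis=1)    # statistic under noise only
energy_h1 = (signal_plus_noise ** 2).mean(axis=1)

threshold = np.quantile(energy_h0, 0.99)      # set for Pf of about 1 %
pf = (energy_h0 > threshold).mean()
pd = (energy_h1 > threshold).mean()
print(f"Pf = {pf:.3f}, Pd = {pd:.3f}")
```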
Procedia PDF Downloads 381

15949 The Effect of Second Victim-Related Distress on Work-Related Outcomes in Tertiary Care, Kelantan, Malaysia
Authors: Ahmad Zulfahmi Mohd Kamaruzaman, Mohd Ismail Ibrahim, Ariffin Marzuki Mokhtar, Maizun Mohd Zain, Saiful Nazri Satiman, Mohd Najib Majdi Yaacob
Abstract:
Background: In the aftermath of patient safety incidents, the healthcare providers involved may sustain second victim-related distress (second victim distress and reduced professional efficacy), with subsequent negative work-related outcomes or, conversely, the cultivation of resilience. This study aimed to investigate the factors affecting negative work-related outcomes and resilience, with the triad of support (colleague, supervisor, and institutional support) as the hypothetical mediators. Methods: This was a cross-sectional study recruiting a total of 733 healthcare providers from three tertiary care centres in Kelantan, Malaysia. Three steps of hierarchical linear regression were developed for each outcome: negative work-related outcomes and resilience. Then, four multiple-mediator models of the support triad were analyzed. Results: Second victim distress, professional efficacy, and the support triad contributed significantly to each regression model. In the pathway from professional efficacy to each of negative work-related outcomes and resilience, colleague support partially mediated the relationship. For second victim distress on negative work-related outcomes, colleague and supervisor support were partial mediators, and on resilience all of the support triad produced a similar effect. Conclusion: Second victim distress, professional efficacy, and the support triad influenced negative work-related outcomes and resilience. The support triad, as mediators, ameliorated the effect in between and underlines the importance of good support for recovery after encountering patient safety incidents.
Keywords: second victims, patient safety incidents, hierarchical linear regression, mediation, support
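For readers unfamiliar with the mediation step, the following sketch shows a minimal three-regression (Baron and Kenny style) mediation check in Python with statsmodels, using simulated data and hypothetical variable names; it is only a schematic of the type of analysis described, not the authors' multiple-mediator models.

```python
# Minimal mediation sketch (Baron-Kenny style) with statsmodels: does colleague
# support partially mediate the effect of second-victim distress on negative
# work-related outcomes? Variable names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 733
distress = rng.normal(size=n)
support = -0.4 * distress + rng.normal(size=n)               # mediator
outcome = 0.5 * distress - 0.3 * support + rng.normal(size=n)
df = pd.DataFrame({"distress": distress, "support": support, "outcome": outcome})

path_a = smf.ols("support ~ distress", df).fit()             # X -> M
path_b = smf.ols("outcome ~ distress + support", df).fit()   # X + M -> Y
total = smf.ols("outcome ~ distress", df).fit()              # X -> Y (total effect)

indirect = path_a.params["distress"] * path_b.params["support"]
print("total effect:   ", round(total.params["distress"], 3))
print("direct effect:  ", round(path_b.params["distress"], 3))
print("indirect effect:", round(indirect, 3))
```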
Procedia PDF Downloads 112

15948 Networked Radar System to Increase Safety of Urban Railroad Crossing
Authors: Sergio Saponara, Luca Fanucci, Riccardo Cassettari, Ruggero Piernicola, Marco Righetto
Abstract:
The paper presents an innovative networked radar system for the detection of obstacles in a railway level crossing scenario. This Monitoring System (MS) is able to detect moving or still obstacles within the railway level crossing area automatically, avoiding the need for human presence for surveillance. The MS is also connected to the National Railway Information and Signaling System to communicate the level crossing status in real time. The architecture is compliant with the highest Safety Integrity Level (SIL4) of the CENELEC standard. The number of radar sensors used is configurable at set-up time and depends on how large the level crossing area can be. At least two sensors are expected, and up to four can be used for larger areas. The whole processing chain that elaborates the output sensor signals, as well as the communication interface, is fully digital; it was designed in VHDL code and implemented on a Xilinx Virtex 6.
Keywords: radar for safe mobility, railroad crossing, railway, transport safety
Procedia PDF Downloads 485

15947 Parking Space Detection and Trajectory Tracking Control for Vehicle Auto-Parking
Authors: Shiuh-Jer Huang, Yu-Sheng Hsu
Abstract:
An on-board parking space detection system, parking trajectory planning, and a tracking control mechanism are the key components of a vehicle backward auto-parking system. First, a pair of ultrasonic sensors is installed on each side of the vehicle body surface to detect the relative distance between the ego-car and surrounding obstacles. The dimension of a found empty space can be calculated based on the vehicle speed and the time history of the ultrasonic sensor readings. This result can be used for constructing a 2D map of the vehicle's environment and judging the available parking type. Finally, the auto-parking controller executes on-line optimal parking trajectory planning based on this 2D environmental map and monitors the real-time vehicle parking trajectory tracking control. This low-cost auto-parking system was tested on a model car.
Keywords: vehicle auto-parking, parking space detection, parking path tracking control, intelligent fuzzy controller
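A minimal sketch of the gap-measurement idea, under assumed numbers: while the ego-car drives past at a known speed, lateral ultrasonic readings above a clearance threshold are accumulated into a free-space length, which is then compared with the length needed for a parallel parking manoeuvre. The threshold, speed, and readings are hypothetical.

```python
# Sketch: estimate the length of an empty roadside space from the ultrasonic
# distance time history while the ego-car drives past at a known speed.
def free_space_length(distances, speed_mps, sample_dt, gap_threshold=1.5):
    """distances: lateral readings in metres, one per sample_dt seconds."""
    best, current = 0.0, 0.0
    for d in distances:
        if d > gap_threshold:                 # no obstacle alongside the car
            current += speed_mps * sample_dt  # gap grows as the car moves forward
            best = max(best, current)
        else:
            current = 0.0
    return best

# Hypothetical drive-by at 1.5 m/s, sampled every 0.1 s: wall, gap, parked car.
readings = [0.8] * 10 + [2.6] * 40 + [0.7] * 10
length = free_space_length(readings, speed_mps=1.5, sample_dt=0.1)
print(f"detected gap of {length:.1f} m ->",
      "parallel parking feasible" if length > 5.5 else "too short")
```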
Procedia PDF Downloads 246

15946 Calculating Asphaltenes Precipitation Onset Pressure by Using Cardanol as Precipitation Inhibitor: A Strategy to Increment the Oil Well Production
Authors: Camilo A. Guerrero-Martin, Erik Montes Paez, Marcia C. K. Oliveira, Jonathan Campos, Elizabete F. Lucas
Abstract:
Asphaltene precipitation is considered a formation damage problem, which can reduce the oil recovery factor. It fouls piping and surface installations, causes serious flow assurance complications, and reduces oil well production. Therefore, researchers have shown an interest in chemical treatments to control this phenomenon. The aim of this paper is to assess the asphaltene precipitation onset of crude oils in the presence of cardanol, by titrating the crude with n-heptane. Moreover, based on the results obtained at atmospheric pressure, the asphaltene precipitation onset pressures were calculated to predict asphaltene precipitation in the reservoir, using differential liberation and refractive index data of the oils. The influence of cardanol concentration on the asphaltene stabilization of three Brazilian crude oil samples (with similar API densities) was studied. Formulations of cardanol in toluene were therefore prepared at 0, 3, 5, 10 and 15 m/m%. The formulations were added to the crude at a 2:98 ratio. The petroleum samples were characterized by API density, elemental analysis and differential liberation tests. The asphaltene precipitation onset (APO) was determined by titrating with n-heptane and monitoring with near-infrared (NIR) spectroscopy. UV-Vis spectroscopy experiments were also done to assess the precipitated asphaltene content. The asphaltene precipitation envelopes (APE) were also determined by numerical simulation (Multiflash). In addition, adequate artificial lift systems (ALS) for the oils were selected, based on the downhole well profile and a screening methodology. Finally, the oil flow rates were modelled by NODAL production system analysis in the PIPESIM software. The results of this study show that the asphaltene precipitation onsets of the crude oils were 2.2, 2.3 and 6.0 mL of n-heptane/g of oil. Cardanol was an effective inhibitor of asphaltene precipitation for the crude oils used in this study, since it displaces the precipitation pressure of the oil to lower values. This indicates that cardanol can increase the productivity of oil wells.
Keywords: asphaltenes, NODAL analysis production system, precipitation pressure onset, inhibitory molecule
Procedia PDF Downloads 178

15945 A High Performance Piano Note Recognition Scheme via Precise Onset Detection and Segmented Short-Time Fourier Transform
Authors: Sonali Banrjee, Swarup Kumar Mitra, Aritra Acharyya
Abstract:
A piano note recognition method has been proposed by the authors in this paper. The authors have used a comprehensive method for onset detection of each note present in a piano piece, followed by a segmented short-time Fourier transform (STFT) for the identification of the piano notes. The performance evaluation of the proposed method has been carried out in harsh noisy environments by adding additive white Gaussian noise (AWGN) at different signal-to-noise ratios (SNR) to the original signal and evaluating the note detection error rate (NDER) for piano pieces consisting of different numbers of notes at each SNR level. The NDER is found to remain within 15% for all piano pieces under consideration when the SNR is kept above 8 dB.
Keywords: AWGN, onset detection, piano note, STFT
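The sketch below illustrates the two stages on a synthetic two-note signal: onsets are flagged where the short-time energy jumps, and each inter-onset segment is Fourier-transformed so its dominant frequency can be mapped to a MIDI note number. The onset rule and frame size are simplifications, not the comprehensive method of the paper.

```python
# Sketch: detect note onsets by jumps in short-time energy, then run an FFT on
# each inter-onset segment and map the dominant frequency to a MIDI note number.
import numpy as np

fs = 8000
t = np.arange(0, 1.0, 1 / fs)
tone = lambda f, start: np.sin(2 * np.pi * f * t) * (t >= start) * np.exp(-(t - start) * 3)
x = tone(440.0, 0.0) + tone(523.25, 0.5)          # A4 followed by C5 (synthetic)

frame = 256
energy = np.array([np.sum(x[i:i + frame] ** 2) for i in range(0, len(x) - frame, frame)])
onsets = [i for i in range(1, len(energy)) if energy[i] > 3.0 * energy[i - 1] + 0.1]
boundaries = [0] + [i * frame for i in onsets] + [len(x)]

for start, end in zip(boundaries[:-1], boundaries[1:]):
    seg = x[start:end]
    if len(seg) < frame:
        continue                                   # skip segments too short to analyse
    spectrum = np.abs(np.fft.rfft(seg))
    f0 = np.fft.rfftfreq(len(seg), 1 / fs)[np.argmax(spectrum)]
    midi = int(round(69 + 12 * np.log2(f0 / 440.0)))
    print(f"segment at {start / fs:.2f} s: {f0:.1f} Hz -> MIDI note {midi}")
```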
Procedia PDF Downloads 161

15944 Effects of Virtual Reality on the Upper Extremity Spasticity and Motor Function in Patients with Stroke: A Single Blinded Randomized Controlled Trial
Authors: Kasra Afsahi, Maryam Soheilifar, S. Hossein Hosseini, Omid Seyed Esmaeili, Rouzbeh Kezemi, Noushin Mehrbod, Nazanin Vahed, Tahereh Hajiahmad, Noureddin Nakhostin Ansari
Abstract:
Background: Stroke is a disabling neurological disease, and rehabilitative therapies are important treatment methods. This clinical trial was done to compare the effects of VR combined with conventional rehabilitation versus conventional rehabilitation alone on spasticity and motor function in stroke patients. Materials and Methods: In this open-label randomized controlled clinical trial, 40 consecutive patients with a stable first-ever ischemic stroke in the past three to 12 months who were referred to a rehabilitation clinic in Tehran, Iran, in 2020 were enrolled. After signing the informed written consent form, subjects were randomly assigned 1:1, using block randomization with blocks of five, into two groups of 20 cases: a combination group, receiving a 45-minute conventional therapy session plus 15 minutes of VR therapy, and a conventional group, receiving a 60-minute conventional therapy session. The VR rehabilitation was designed and developed in several stages. Outcomes were the modified Ashworth scale, the recovery stage score for motor function, the range of motion (ROM) of shoulder abduction/wrist extension, and the patients’ satisfaction rate. Data were compared after study termination. Results: The satisfaction rate among the patients was significantly better in the combination group (P=0.003). Only wrist extension differed between groups and was better in the combination group. The variables generally showed statistically significant differences (P < 0.05). Conclusion: Virtual reality plus conventional rehabilitation therapy is superior to conventional rehabilitation alone for wrist and elbow spasticity and motor function in patients with stroke.
Keywords: stroke, virtual therapy, rehabilitation, treatment
Procedia PDF Downloads 236

15943 Using Multi-Specialist Team to Care for a Breast Cancer Patient Who Received Total Mastectomy during Pregnancy
Authors: Yun-Tsuen Chen, Shih-Ting Huang, Pi-Fen Cheng, Heng-Hua Wang, Hui-Zhu Chen
Abstract:
This paper discusses the experience of caring for a patient diagnosed with breast cancer who received a total mastectomy during the second trimester of pregnancy. She was hospitalized from January 31 to February 4, 2018. Using Gordon’s 11 Functional Health Patterns through physical exams and interviews, the researcher assessed the patient’s physical and mental health and determined that the patient had anxiety, acute pain, and body image disturbance. After establishing a strong relationship with the patient, the researcher helped the patient express her anxiety and personal feelings. A multi-specialist team was formed to evaluate both the patient and her unborn child before, during, and after surgery. This individualized care allowed the patient and her child to optimize the post-operative results. Aside from medication, the patient also received non-medicinal treatment, including improvement of sleep quality with body positioning and diaphragmatic breathing exercises for pain and stress relief after surgery. Throughout hospitalization, the patient’s physical and emotional needs were addressed daily with listening sessions and empathy. The patient’s husband was also incorporated into the patient’s recovery by teaching both him and the patient how to change the sterile wound dressing, which may have the added benefit of improving marital relationships through shared activities of nurturing. The patient was also given advice about how to improve self-confidence through clothing. Lastly, the patient was encouraged to join a support group for breast cancer patients. Through the sharing of experience in groups and within the family, the patient was helped to adapt to the change in her appearance and re-establish her self-confidence. This level of care expedited the patient’s return to her family life and her role as a mother.
Keywords: anxiety, body image disturbance, breast cancer during pregnancy, multi-specialist team
Procedia PDF Downloads 101

15942 Portable Cardiac Monitoring System Based on Real-Time Microcontroller and Multiple Communication Interfaces
Authors: Ionel Zagan, Vasile Gheorghita Gaitan, Adrian Brezulianu
Abstract:
This paper presents the design of a mobile system named Tele-ECG, implemented for the remote monitoring of cardiac patients. For better flexibility of this application, the authors chose to implement local memory and multiple communication interfaces. The project described in this paper is based on the ARM Cortex M0+ microcontroller and the dedicated ADAS1000 chip for the collection and transmission of electrocardiogram (ECG) signals from the patient to the microcontroller, without altering the performance and stability of the system. The novelty brought by this paper is the implementation of a remote monitoring system for cardiac patients with real-time behavior and multiple interfaces. The microcontroller is responsible for processing the digital signals corresponding to the ECG and also for implementing the communication interface with the main server, using a GSM/Bluetooth SIMCOM SIM800C module. This paper describes all the characteristics of the Tele-ECG project, representing a feasible implementation in the biomedical field. Acknowledgment: This paper was supported by the project 'Development and integration of a mobile tele-electrocardiograph in the GreenCARDIO© system for patients monitoring and diagnosis - m-GreenCARDIO', Contract no. BG58/30.09.2016, PNCDI III, Bridge Grant 2016, using the infrastructure from the project 'Integrated Center for research, development and innovation in Advanced Materials, Nanotechnologies, and Distributed Systems for fabrication and control', Contract No. 671/09.04.2015, Sectoral Operational Program for Increase of the Economic Competitiveness co-funded from the European Regional Development Fund.
Keywords: Tele-ECG, real-time cardiac monitoring, electrocardiogram, microcontroller
Procedia PDF Downloads 274

15941 Disentangling the Sources and Context of Daily Work Stress: Study Protocol of a Comprehensive Real-Time Modelling Study Using Portable Devices
Authors: Larissa Bolliger, Junoš Lukan, Mitja Lustrek, Dirk De Bacquer, Els Clays
Abstract:
Introduction and Aim: Chronic workplace stress and its health-related consequences, like mental and cardiovascular diseases, have been widely investigated. This project focuses on the sources and context of psychosocial daily workplace stress in a real-world setting. The main objective is to analyze and model real-time relationships between (1) psychosocial stress experiences within the natural work environment, (2) micro-level work activities and events, and (3) physiological signals and behaviors in office workers. Methods: An Ecological Momentary Assessment (EMA) protocol has been developed, partly building on machine learning techniques. Empatica® wristbands will be used for real-life detection of stress from physiological signals; micro-level activities and events at work will be based on smartphone registrations, further processed by an automated computer algorithm. A field study including 100 office-based workers with high-level problem-solving tasks, like managers and researchers, will be implemented in Slovenia and Belgium (50 in each country). Data mining and state-of-the-art statistical methods – mainly multilevel statistical modelling for repeated data – will be used. Expected Results and Impact: The project findings will provide novel contributions to the field of occupational health research. While traditional assessments provide information about the globally perceived state of chronic stress exposure, the EMA approach is expected to bring new insights into daily fluctuating work stress experiences, especially micro-level events and activities at work that induce acute physiological stress responses. The project is therefore likely to generate further evidence on relevant stressors in a real-time working environment and hence make it possible to advise on workplace procedures and policies for reducing stress.
Keywords: ecological momentary assessment, real-time, stress, work
Procedia PDF Downloads 163

15940 Evaluation of Mixing and Oxygen Transfer Performances for a Stirred Bioreactor Containing P. chrysogenum Broths
Authors: A. C. Blaga, A. Cârlescu, M. Turnea, A. I. Galaction, D. Caşcaval
Abstract:
The performance of an aerobic stirred bioreactor for fungal fermentation was analyzed on the basis of mixing time and oxygen mass transfer coefficient, by quantifying the influence of some specific geometrical and operational parameters of the bioreactor, as well as the rheological behavior of Penicillium chrysogenum broth (free mycelia and mycelial aggregates). The rheological properties of the fungal broth, controlled by the biomass concentration, its growth rate, and its morphology, strongly affect the performance of the bioreactor. Experimental data showed that, for both morphological structures, the accumulation of fungal biomass induces a significant increase of broth viscosity and modifies the rheological behavior. For lower P. chrysogenum concentrations (both morphological conformations), the mixing time initially increases with aeration rate, reaches a maximum value and then decreases. This variation can be explained by the formation of small bubbles, due to the presence of the solid phase, which hinders bubble coalescence, the rising velocity of the bubbles being reduced by the high apparent viscosity of the fungal broths. With biomass accumulation, the variation of mixing time with aeration rate gradually changes, a continuous reduction of mixing time with increasing air input flow being obtained for 33.5 g/l d.w. P. chrysogenum. Owing to the higher apparent viscosity, which considerably reduces the relative contribution of mechanical agitation to broth mixing, these phenomena are more pronounced for P. chrysogenum free mycelia. Due to the increase of broth apparent viscosity, biomass accumulation induces two significant effects on the oxygen transfer rate: the diminution of turbulence and the perturbation of the bubble dispersion-coalescence equilibrium. The increase of P. chrysogenum free mycelia concentration leads to a decrease of kla values. Thus, for the considered variation domain of the main parameters, namely air superficial velocity from 8.36×10⁻⁴ to 5.02×10⁻³ m/s and specific power input from 100 to 500 W/m³, kla was reduced by a factor of 3.7 for a biomass concentration increase from 4 to 36.5 g/l d.w. The broth containing P. chrysogenum mycelial aggregates exhibits a particular behavior from the point of view of oxygen transfer. Regardless of the bioreactor operating conditions, the increase of biomass concentration initially leads to an increase of the oxygen mass transfer rate, a phenomenon that can be explained by the interaction of pellets with bubbles. The results are related to the increase of the apparent viscosity of the broths corresponding to the variation of biomass concentration between the mentioned limits. Thus, the apparent viscosity of the suspension of fungal mycelial aggregates increased by a factor of 44.2, and that of fungal free mycelia by a factor of 63.9, for a CX increase from 4 to 36.5 g/l d.w. By means of the experimental data, some mathematical correlations describing the influences of the considered factors on mixing time and kla have been proposed. The proposed correlations can be used in bioreactor performance evaluation, optimization, and scaling-up.
Keywords: biomass concentration, mixing time, oxygen mass transfer, P. chrysogenum broth, stirred bioreactor
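As an illustration of what such correlations typically look like, the sketch below fits a power-law form kla = a·(P/V)^alpha·(vs)^beta to made-up measurements with a log-linear least-squares fit; the data, exponents, and functional form are assumptions and do not reproduce the correlations proposed in the abstract.

```python
# Sketch: fit a power-law mass-transfer correlation  kla = a * (P/V)**alpha * vs**beta
# to made-up measurements using a log-linear least-squares fit with numpy.
import numpy as np

# Hypothetical data: specific power input P/V (W/m3), superficial air velocity vs (m/s), kla (1/s)
PV = np.array([100, 200, 300, 400, 500, 100, 300, 500], dtype=float)
vs = np.array([8.36e-4, 8.36e-4, 2.5e-3, 2.5e-3, 5.02e-3, 5.02e-3, 8.36e-4, 2.5e-3])
kla = 2.1e-3 * PV ** 0.45 * vs ** 0.38 * np.exp(np.random.default_rng(2).normal(0, 0.05, 8))

# Linear regression in log space: ln(kla) = ln(a) + alpha*ln(P/V) + beta*ln(vs)
X = np.column_stack([np.ones_like(PV), np.log(PV), np.log(vs)])
coef, *_ = np.linalg.lstsq(X, np.log(kla), rcond=None)
a, alpha, beta = np.exp(coef[0]), coef[1], coef[2]
print(f"kla = {a:.2e} * (P/V)^{alpha:.2f} * vs^{beta:.2f}")
```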
Procedia PDF Downloads 342

15939 A Physiological Approach for Early Detection of Hemorrhage
Authors: Rabie Fadil, Parshuram Aarotale, Shubha Majumder, Bijay Guargain
Abstract:
Hemorrhage is the loss of blood from the circulatory system and a leading cause of battlefield and postpartum-related deaths. Early detection of hemorrhage remains the most effective strategy to reduce the mortality rate caused by traumatic injuries. In this study, we investigated the physiological changes via non-invasive cardiac signals at rest and under different hemorrhage conditions simulated through graded lower-body negative pressure (LBNP). Simultaneous electrocardiogram (ECG), photoplethysmogram (PPG), blood pressure (BP), impedance cardiogram (ICG), and phonocardiogram (PCG) were acquired from 10 participants (age: 28 ± 6 years, weight: 73 ± 11 kg, height: 172 ± 8 cm). The LBNP protocol consisted of applying -20, -30, -40, -50, and -60 mmHg pressure to the lower half of the body. Beat-to-beat heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) were extracted from the ECG and blood pressure. Systolic amplitude (SA), systolic time (ST), diastolic time (DT), and left ventricular ejection time (LVET) were extracted from the PPG during each stage. Preliminary results showed that the application of -40 mmHg, i.e., the moderate simulated hemorrhage stage, resulted in significant changes in HR (85±4 bpm vs 68 ± 5 bpm, p < 0.01), ST (191 ± 10 ms vs 253 ± 31 ms, p < 0.05), LVET (350 ± 14 ms vs 479 ± 47 ms, p < 0.05) and DT (551 ± 22 ms vs 683 ± 59 ms, p < 0.05) compared to rest, while no change was observed in SA (p > 0.05) as a consequence of LBNP application. These findings demonstrate the potential of cardiac signals in detecting moderate hemorrhage. In future work, we will analyze all the LBNP stages and investigate the feasibility of other physiological signals to develop a predictive machine learning model for early detection of hemorrhage.
Keywords: blood pressure, hemorrhage, lower-body negative pressure, LBNP, machine learning
Procedia PDF Downloads 168

15938 A Parallel Algorithm for Solving the PFSP on the Grid
Authors: Samia Kouki
Abstract:
Exact search methods for NP-hard combinatorial optimization problems, such as Branch-and-Bound, may degenerate to complete enumeration. For that reason, exact approaches limit us to solving only small or moderate-size problem instances, due to the exponential increase in CPU time when problem size increases. One of the most promising ways to significantly reduce the computational burden of sequential versions of Branch-and-Bound is to design parallel versions of these algorithms which employ several processors. This paper describes a parallel Branch-and-Bound algorithm called GALB for solving the classical permutation flowshop scheduling problem, as well as its implementation on a Grid computing infrastructure. The experimental study of our distributed parallel algorithm gives promising results and clearly shows the benefit of the parallel paradigm for solving large-scale instances in moderate CPU time.
Keywords: grid computing, permutation flow shop problem, branch and bound, load balancing
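To show the kind of tree search being parallelized, here is a compact sequential Branch-and-Bound sketch for the permutation flow shop with the makespan criterion; it uses a simple machine-based lower bound on a small hypothetical instance and omits the grid distribution and load-balancing layer that GALB adds.

```python
# Compact sequential Branch-and-Bound sketch for the permutation flow shop
# problem (makespan criterion). GALB parallelizes this kind of tree search
# across grid nodes; only the core bounding/branching logic is shown here.
import itertools

# P[j][m] = processing time of job j on machine m (hypothetical 4x3 instance)
P = [[5, 4, 4], [5, 3, 7], [3, 2, 3], [6, 4, 4]]
n_jobs, n_mach = len(P), len(P[0])

def completion(seq):
    """Machine completion times after processing the partial sequence `seq`."""
    c = [0] * n_mach
    for j in seq:
        c[0] += P[j][0]
        for m in range(1, n_mach):
            c[m] = max(c[m], c[m - 1]) + P[j][m]
    return c

def lower_bound(seq, remaining):
    """Machine-based bound: current load plus all remaining work on each machine."""
    c = completion(seq)
    return max(c[m] + sum(P[j][m] for j in remaining) for m in range(n_mach))

best_seq, best_cmax = None, float("inf")

def branch(seq, remaining):
    global best_seq, best_cmax
    if not remaining:
        cmax = completion(seq)[-1]
        if cmax < best_cmax:
            best_seq, best_cmax = seq, cmax
        return
    for j in sorted(remaining):
        child = seq + [j]
        if lower_bound(child, remaining - {j}) < best_cmax:   # prune dominated nodes
            branch(child, remaining - {j})

branch([], set(range(n_jobs)))
print("optimal order:", best_seq, "makespan:", best_cmax)
# Sanity check against brute force on this small instance:
assert best_cmax == min(completion(list(p))[-1] for p in itertools.permutations(range(n_jobs)))
```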
Procedia PDF Downloads 283

15937 Research Project of National Interest (PRIN-PNRR) DIVAS: Developing Methods to Assess Tree Vitality after a Wildfire through Analyses of Cambium Sugar Metabolism
Authors: Claudia Cocozza, Niccolò Frassinelli, Enrico Marchi, Cristiano Foderi, Alessandro Bizzarri, Margherita Paladini, Maria Laura Traversi, Eleftherious Touloupakis, Alessio Giovannelli
Abstract:
The development of tools to quickly identify the fate of injured trees after stress is highly relevant when the biodiversity restoration of damaged sites is based on nature-based solutions. In this context, an approach to assess irreversible physiological damage within trees could help support management planning decisions for perturbed sites, in order to restore biodiversity, protect the environment, and understand the functional adjustments of ecosystems. Tree vitality can be estimated by a series of physiological proxies, like cambium activity and starch and soluble sugar amounts in C-sinks, whilst the accumulation of ethanol within the cambial cells and phloem is considered an alert of cell death. However, their determination requires time-consuming laboratory protocols, which makes the approach unfeasible as a practical option in the field. The project aims to develop biosensors to assess the concentration of soluble sugars and ethanol in stem tissues. Soluble sugar and ethanol concentrations will be used to characterize injured trees, in order to discriminate between compromised and recovering trees directly in the forest. To reach this goal, we select study sites subjected to prescribed fires or recent wildfires as experimental set-ups. Indeed, in Mediterranean countries, forest fire is a recurrent event that must be considered a central component of regional and global strategies in forest management and biodiversity restoration programs. A biosensor will be developed through a multistep process covering target analyte characterization, bioreceptor selection and, finally, calibration and testing of the sensor. To validate the biosensor signals, soluble sugars and ethanol will be quantified by HPLC and GC using synthetic media (in the lab) and phloem sap (in the field), whilst cambium vitality will be assessed by anatomical observations. On burnt trees, stem growth will be monitored by dendrometers and/or estimated by tree ring analyses, whilst the tree response to past fire events will be assessed by isotopic discrimination. Moreover, fire characterization and a visual assessment procedure will be used to assign burnt trees to a vitality class. At the end of the project, a well-defined procedure combining the biosensor signal and visual assessment will be produced and applied to a case study. The project outcomes and the results obtained will be properly packaged to reach, engage and address the needs of the final users and widely shared with relevant stakeholders involved in the optimal use of biosensors and in the management of post-fire areas. This project was funded by the National Recovery and Resilience Plan (NRRP), Mission 4, Component C2, Investment 1.1 - Call for tender No. 1409 of 14 September 2022 – ‘Progetti di Ricerca di Rilevante interesse Nazionale – PRIN’ of the Italian Ministry of University and Research, funded by the European Union – NextGenerationEU; Grant N° P2022Z5742, CUP B53D23023780001.
Keywords: phloem, scorched crown, conifers, prescribed burning, biosensors
Procedia PDF Downloads 20

15936 Adsorption of Xylene Cyanol FF onto Activated Carbon from Brachystegia Eurycoma Seed Hulls: Determination of the Optimal Conditions by Statistical Design of Experiments
Authors: F. G Okibe, C. E Gimba, V. O Ajibola, I. G Ndukwe, E. D. Paul
Abstract:
A full factorial experimental design technique with two levels and four factors (2⁴) was used to optimize the adsorption (measured at 615 nm) of Xylene Cyanol FF from aqueous solutions onto activated carbon prepared from Brachystegia eurycoma seed hulls by a chemical carbonization method. The effects of pH (3 and 5), initial dye concentration (20 and 60 mg/l), adsorbent dosage (0.01 and 0.05 g), and contact time (30 and 60 min) on the dye removal efficiency of the adsorbent were investigated at 298 K. From the analysis of variance, response surface and cube plot, adsorbent dosage was observed to be the most significant factor affecting the adsorption process. From the interaction between the variables studied, the optimum removal efficiency of 96.80 % was achieved with an adsorbent dosage of 0.05 g, a contact time of 45 minutes, pH 3, and an initial dye concentration of 60 mg/l.
Keywords: factorial experimental design, adsorption, optimization, Brachystegia eurycoma, xylene cyanol FF
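A small sketch of how a 2⁴ design and its main effects can be generated and estimated: the 16 coded runs come from the Cartesian product of the two levels of each factor, and each main effect is the average contrast between its high and low settings. The response values below are invented; they only mimic the paper's finding that adsorbent dosage dominates.

```python
# Sketch: build a 2^4 full factorial design and estimate main effects on the
# dye removal efficiency. The response values below are made up for illustration.
import itertools
import numpy as np

factors = ["pH", "initial_conc", "dosage", "contact_time"]
levels = {"pH": (3, 5), "initial_conc": (20, 60), "dosage": (0.01, 0.05), "contact_time": (30, 60)}

design = list(itertools.product([-1, +1], repeat=4))      # 16 coded runs
rng = np.random.default_rng(3)
# Hypothetical responses: dosage dominates, pH hurts slightly, plus noise.
response = np.array([70 + 12 * d + 5 * c - 3 * p + 2 * t + rng.normal(0, 1)
                     for p, c, d, t in design])

X = np.array(design, dtype=float)
main_effects = 2 * X.T @ response / len(design)           # effect = high mean - low mean
for name, eff in zip(factors, main_effects):
    low, high = levels[name]
    print(f"{name:>13} ({low} -> {high}): effect {eff:+.1f} % removal")
```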
Procedia PDF Downloads 403

15935 Environmental Assessment of Roll-to-Roll Printed Smart Label
Authors: M. Torres, A. Moulay, M. Zhuldybina, M. Rozel, N. D. Trinh, C. Bois
Abstract:
Printed electronics are a fast-growing market, as their applications cover a large range of industrial needs, their production cost is low, and the additive printing techniques consume less material than the subtractive manufacturing methods used in traditional electronics. With the growing demand for printed electronics, there are concerns about their harmful and irreversible contribution to the environment. Indeed, it is estimated that 80% of the environmental load of a product is determined by the choices made at the conception stage. Therefore, examination through a life cycle approach at the development stage of a novel product is the best way to identify potential environmental issues and make proactive decisions. Life cycle analysis (LCA) is a comprehensive scientific method to assess the environmental impacts of a product in the different stages of its life: extraction of raw materials, manufacture and distribution, use, and end-of-life. Impacts and major hotspots are identified and evaluated through a broad range of environmental impact categories of the ReCiPe (H) midpoint method. At the conception stage, LCA is a tool that provides an environmental point of view on the choice of materials and processes and weighs in on the balance between high-performance materials and eco-friendly materials. Using the life cycle approach, the current work aims to provide a cradle-to-grave life cycle assessment of a roll-to-roll hybrid printed smart label designed for the food cold chain. Furthermore, this presentation will cover the environmental impact of metallic conductive inks, a comparison with promising conductive polymers, an evaluation of energy versus performance of industrial printing processes, a full assessment of the impact of the smart label applied on a cellulosic-based substrate during the recycling process, and the possible recovery of precious metals and rare earth elements.
Keywords: eco-design, label, life cycle assessment, printed electronics
Procedia PDF Downloads 165

15934 Challenge Response-Based Authentication for a Mobile Voting System
Authors: Tohari Ahmad, Hudan Studiawan, Iwang Aryadinata, Royyana M. Ijtihadie, Waskitho Wibisono
Abstract:
Manual voting systems have been implemented worldwide. They have some weaknesses which may decrease the legitimacy of the voting result. An electronic voting system is introduced to minimize these weaknesses. It has been able to provide better results in terms of accuracy and the total time taken in the voting process. Nevertheless, people may be reluctant to go to the polling location for reasons such as distance and time. In order to solve this problem, mobile voting is implemented by utilizing mobile devices. There are many mobile voting architectures available. Overall, the authenticity of the users is the common problem of all voting systems. There must be a mechanism which can verify the users’ authenticity such that only verified users can cast their vote, and only once; others cannot vote. In this paper, a challenge response-based authentication is proposed that utilizes properties of the users, for example, something they have and something they know. In terms of speed, the proposed system provides good results, in addition to the other capabilities offered by the system.
Keywords: authentication, data protection, mobile voting, security
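The sketch below shows one minimal HMAC-based realization of the challenge-response idea, assuming a shared secret provisioned to the voter's device: the server issues a random nonce, the device returns HMAC(secret, nonce), and each verified voter is allowed to vote only once. It is a generic illustration, not the protocol proposed in the paper.

```python
# Minimal HMAC-based challenge-response sketch: the server issues a random
# nonce (challenge) and the voter's device answers with HMAC(secret, nonce),
# proving possession of the shared secret without ever sending it.
import hmac, hashlib, secrets

class Server:
    def __init__(self, registered):              # {voter_id: shared_secret}
        self.registered = registered
        self.pending = {}
        self.voted = set()

    def challenge(self, voter_id):
        nonce = secrets.token_bytes(16)
        self.pending[voter_id] = nonce
        return nonce

    def verify(self, voter_id, response):
        nonce = self.pending.pop(voter_id, None)
        if nonce is None or voter_id in self.voted:
            return False                          # no challenge issued, or already voted
        expected = hmac.new(self.registered[voter_id], nonce, hashlib.sha256).digest()
        if hmac.compare_digest(expected, response):
            self.voted.add(voter_id)              # each verified voter may vote only once
            return True
        return False

def device_answer(secret, nonce):                 # runs on the voter's phone
    return hmac.new(secret, nonce, hashlib.sha256).digest()

secret = secrets.token_bytes(32)
server = Server({"voter-42": secret})
nonce = server.challenge("voter-42")
print("first attempt accepted: ", server.verify("voter-42", device_answer(secret, nonce)))
nonce2 = server.challenge("voter-42")
print("second attempt accepted:", server.verify("voter-42", device_answer(secret, nonce2)))
```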
Procedia PDF Downloads 421

15933 Experimental Study on Two-Step Pyrolysis of Automotive Shredder Residue
Authors: Letizia Marchetti, Federica Annunzi, Federico Fiorini, Cristiano Nicolella
Abstract:
Automotive shredder residue (ASR) is a mixture of waste that makes up 20-25% of end-of-life vehicles. For many years, ASR was commonly disposed of in landfills or incinerated, causing serious environmental problems. Nowadays, thermochemical treatments are a promising alternative, although the heterogeneity of ASR still poses some challenges. One of the emerging thermochemical treatments for ASR is pyrolysis, which promotes the decomposition of long polymeric chains by providing heat in the absence of an oxidizing agent. In this way, pyrolysis converts ASR into solid, liquid, and gaseous fractions. This work aims to improve the performance of a two-step pyrolysis process. After the characterization of the analysed ASR, the focus is on determining the effects of residence time on product yields and gas composition. A batch experimental setup that reproduces the entire process was used. The setup consists of three sections: the pyrolysis section (made of two reactors), the separation section, and the analysis section. Two different residence times were investigated to find suitable conditions for the first sample of ASR. These first tests showed that the products obtained were more sensitive to the residence time in the second reactor. Indeed, slightly increasing the residence time in the second reactor raised the yields of gas and carbon residue and decreased the yield of the liquid fraction. Then, to test the versatility of the setup, the same conditions were applied to a different ASR sample coming from a different chemical plant. The comparison between the two ASR samples shows that similar product yields and compositions are obtained using the same setup.
Keywords: automotive shredder residue, experimental tests, heterogeneity, product yields, two-step pyrolysis
Procedia PDF Downloads 131

15932 Creating Risk Maps on the Spatiotemporal Occurrence of Agricultural Insecticides in Sub-Saharan Africa
Authors: Chantal Hendriks, Harry Gibson, Anna Trett, Penny Hancock, Catherine Moyes
Abstract:
The use of modern inputs for crop protection, such as insecticides, is strongly underestimated in Sub-Saharan Africa. Several studies have measured toxic concentrations of insecticides in fruits, vegetables and fish cultivated in Sub-Saharan Africa. The use of agricultural insecticides has an impact on human and environmental health, but it also has the potential to affect insecticide resistance in malaria-transmitting mosquitoes. To analyse associations between the historic use of agricultural insecticides and the distribution of insecticide resistance through space and time, the use and environmental fate of agricultural insecticides need to be mapped over the same time period. However, data on the use and environmental fate of agricultural insecticides in Africa are limited, and therefore risk maps of the spatiotemporal occurrence of agricultural insecticides are created using environmental data. Environmental data on crop density and crop type were used to select the areas that most likely receive insecticides. These areas were verified by a literature review and expert knowledge. Pesticide fate models were compared to select the most dominant processes that are involved in the environmental fate of insecticides and that can be mapped at a continental scale. The selected processes include surface runoff, erosion, infiltration, volatilization and the storing and filtering capacity of soils. The processes indicate the risk of insecticide accumulation in soil, water, sediment and air. A compilation of all available data on traces of insecticides in the environment was used to validate the maps. The risk maps can result in space- and time-specific measures that reduce the risk of insecticide exposure to non-target organisms.
Keywords: crop protection, pesticide fate, tropics, insecticide resistance
Procedia PDF Downloads 145

15931 An Audit of Climate Change and Sustainability Teaching in Medical School
Authors: Karolina Wieczorek, Zofia Przypaśniak
Abstract:
Climate change is a rapidly growing threat to global health, and part of the responsibility to combat it lies within the healthcare sector itself, including adequate education of future medical professionals. To mitigate the consequences, the General Medical Council (GMC) has equipped medical schools with a list of outcomes regarding sustainability teaching. Students are expected to analyze the impact of the healthcare sector’s emissions on climate change. The delivery of the related teaching content is, however, often inadequate, and insufficient time is devoted to exploring the topics. Teaching curricula lack in-depth exploration of the learning objectives. This study aims to assess the extent and characteristics of climate change and sustainability teaching in the curriculum of a chosen UK medical school (Barts and The London School of Medicine and Dentistry). It compares the data to the national average scores from the Climate Change and Sustainability Teaching (C.A.S.T.) in Medical Education Audit to draw conclusions about teaching at a regional level. This is a single-centre audit of the timetabled teaching sessions in the medical course. The study looked at the academic year 2020/2021 and included a review of all non-elective, core curriculum teaching materials, including tutorials, lectures, written resources, and assignments, in all five years of the undergraduate and graduate degrees, focusing only on mandatory teaching attended by all students (excluding elective modules). The topics covered were cross-checked against the GMC “Educating for Sustainable Healthcare – Priority Learning Outcomes” as the gold standard, to look for coverage of the outcomes and gaps in teaching. Quantitative data were collected in the form of time allocated for teaching, as a proxy for time spent per individual outcome. The data were collected independently by two students (KW and ZP), who had received prior training and assessed two separate data sets to increase inter-rater reliability. In terms of coverage of learning outcomes, 12 out of 13 were taught (against a national average of 9.7). The school ranked sixth in the UK for time spent per topic and second in terms of overall coverage, meaning the school teaches a broad range of topics, with some explored in more detail than others. For the first outcome, 4 out of 4 objectives were covered (national average 3.5), with 47 minutes spent per outcome (average 84 min); for the second, 5 out of 5 were covered (average 3.5), with 46 minutes spent (average 20 min); for the third, 3 out of 4 were covered (average 2.5), with 10 minutes spent (average 19 min). A disproportionately large amount of time is spent delivering teaching on air pollution (respiratory illnesses), which resulted in sustainability topics in other specialties (musculoskeletal, ophthalmology, pediatrics, renal) being excluded from teaching. Conclusions: Currently, there is no coherent national strategy for teaching climate change topics, and as a result, the time spent on teaching and the coverage of objectives are not standardized.
Keywords: audit, climate change, sustainability, education
Procedia PDF Downloads 88

15930 Integrated Model for Enhancing Data Security Processing Time in Cloud Computing
Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali
Abstract:
Cloud computing has been an important and promising field over the recent decade. Cloud computing allows the sharing of resources, services and information among people across the whole world. Although the advantages of using clouds are great, there are many risks in a cloud. Data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure a secure communication system, hide information from other users and save the user's time. In the proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For the user authentication process, a simple username and password are used; the password is protected with SHA-2 as a one-way hash. The proposed system shows an improvement in the processing time of uploading and downloading files to and from the cloud in secure form.
Keywords: cloud computing, data security, SAAS, PAAS, IAAS, Blowfish
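A minimal sketch of the described pipeline, assuming the PyCryptodome library: Blowfish in CBC mode protects file confidentiality, SHA-256 provides the integrity digest, and the password is stored only as a salted SHA-256 hash. Key sizes, modes, and helper names are assumptions, not the authors' implementation.

```python
# Minimal sketch of the described pipeline: Blowfish (CBC) for confidentiality,
# SHA-256 for file integrity and one-way password storage. Uses PyCryptodome
# (assumed installed); this is an illustration, not the authors' implementation.
import hashlib
from Crypto.Cipher import Blowfish
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

def hash_password(password: str, salt: bytes) -> bytes:
    return hashlib.sha256(salt + password.encode()).digest()   # one-way, for login checks

def encrypt_file(data: bytes, key: bytes):
    iv = get_random_bytes(Blowfish.block_size)                  # 8-byte IV
    cipher = Blowfish.new(key, Blowfish.MODE_CBC, iv)
    return iv + cipher.encrypt(pad(data, Blowfish.block_size)), hashlib.sha256(data).hexdigest()

def decrypt_file(blob: bytes, key: bytes, expected_digest: str) -> bytes:
    iv, ciphertext = blob[:Blowfish.block_size], blob[Blowfish.block_size:]
    data = unpad(Blowfish.new(key, Blowfish.MODE_CBC, iv).decrypt(ciphertext), Blowfish.block_size)
    assert hashlib.sha256(data).hexdigest() == expected_digest, "integrity check failed"
    return data

key = get_random_bytes(16)                                      # Blowfish accepts 4-56 byte keys
blob, digest = encrypt_file(b"exam results v1", key)
print(decrypt_file(blob, key, digest))
```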
Procedia PDF Downloads 361