Search results for: turn-over time
16032 Optimization of Electrical Discharge Machining Parameters in Machining AISI D3 Tool Steel by Grey Relational Analysis
Authors: Othman Mohamed Altheni, Abdurrahman Abusaada
Abstract:
This study presents the optimization of multiple performance characteristics [material removal rate (MRR), surface roughness (Ra), and overcut (OC)] of hardened AISI D3 tool steel in electrical discharge machining (EDM) using the Taguchi method and grey relational analysis. The machining process parameters selected were pulsed current Ip, pulse-on time Ton, pulse-off time Toff and gap voltage Vg. Based on ANOVA, pulse current is found to be the most significant factor affecting the EDM process. The optimized process parameters, which simultaneously lead to a higher MRR, lower Ra, and lower OC, are then verified through a confirmation experiment. The validation experiment shows improved MRR, Ra and OC when the Taguchi method and grey relational analysis are used.
Keywords: EDM parameters, grey relational analysis, Taguchi method, ANOVA
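The grey relational ranking described above can be sketched in a few lines. The response matrix below is purely illustrative (not the paper's measurements), and the distinguishing coefficient ζ = 0.5 is the conventional default:

```python
import numpy as np

# Hypothetical response matrix: rows = experimental runs, columns = MRR, Ra, OC.
responses = np.array([
    [12.0, 3.2, 0.30],
    [18.5, 4.1, 0.42],
    [15.2, 2.8, 0.25],
    [20.1, 4.9, 0.50],
])

# Step 1: normalize. MRR is larger-the-better; Ra and OC are smaller-the-better.
larger_better = responses[:, 0:1]
smaller_better = responses[:, 1:]
norm_lb = (larger_better - larger_better.min(0)) / (larger_better.max(0) - larger_better.min(0))
norm_sb = (smaller_better.max(0) - smaller_better) / (smaller_better.max(0) - smaller_better.min(0))
normalized = np.hstack([norm_lb, norm_sb])

# Step 2: grey relational coefficient with distinguishing coefficient zeta = 0.5.
delta = 1.0 - normalized                      # deviation from the ideal sequence
zeta = 0.5
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Step 3: grey relational grade = mean coefficient per run; the highest grade wins.
grade = grc.mean(axis=1)
best_run = int(np.argmax(grade))
print(best_run, grade.round(3))
```

The run with the highest grey relational grade is the multi-objective optimum; ANOVA on the grades then identifies the dominant factor.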
Procedia PDF Downloads 294
16031 Assessing Functional Structure in European Marine Ecosystems Using a Vector-Autoregressive Spatio-Temporal Model
Authors: Katyana A. Vert-Pre, James T. Thorson, Thomas Trancart, Eric Feunteun
Abstract:
In marine ecosystems, spatial and temporal species structure is an important component of ecosystems’ response to anthropogenic and environmental factors. Although spatial distribution patterns and temporal series of fish abundance have been studied in the past, little research has been devoted to the joint dynamic spatio-temporal functional patterns in marine ecosystems and their use in multispecies management and conservation. Each species represents a function in the ecosystem, and the distribution of these species might not be random. A heterogeneous functional distribution will lead to an ecosystem that is more resilient to external factors. Applying a Vector-Autoregressive Spatio-Temporal (VAST) model for count data, we estimate the spatio-temporal distribution, shift in time, and abundance of 140 species of the Eastern English Channel, Bay of Biscay and Mediterranean Sea. From the model outputs, we determined spatio-temporal clusters, calculating p-values for hierarchical clustering via multiscale bootstrap resampling. Then, we designed a functional map given the defined clusters. We found that the species distribution within the ecosystem was not random. Indeed, species evolved in space and time in clusters. Moreover, these clusters remained similar over time, deriving from the fact that species of the same cluster often shifted in sync, keeping the overall structure of the ecosystem similar over time. Knowing the co-existing species within these clusters could help with predicting the distribution and abundance of data-poor species. Further analysis is being performed to assess the ecological functions represented in each cluster.
Keywords: cluster distribution shift, European marine ecosystems, functional distribution, spatio-temporal model
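As a schematic illustration of the clustering step (not the authors' VAST plus multiscale-bootstrap pipeline, which is typically run with pvclust in R), species can be grouped by the similarity of their estimated spatial density profiles; the density matrix below is synthetic:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Hypothetical matrix: rows = species, columns = estimated density at spatial
# knots (stand-in for VAST output); two synthetic groups with distinct profiles.
group_a = rng.normal(loc=5.0, scale=0.5, size=(6, 20))
group_b = rng.normal(loc=1.0, scale=0.5, size=(6, 20))
densities = np.vstack([group_a, group_b])

# Ward linkage on Euclidean distances between species profiles, cut into
# two clusters; species sharing a cluster co-occur in space.
Z = linkage(densities, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```

The real analysis attaches a bootstrap p-value to each merge before accepting a cluster, which this sketch omits.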
Procedia PDF Downloads 194
16030 Techniques of Construction Management in Civil Engineering
Authors: Mamoon M. Atout
Abstract:
The Middle East Gulf region has witnessed rapid growth and development in many areas over the last two decades. The development of the real-estate sector, the construction industry and infrastructure projects represents a major share of the development that has contributed to the advancement of the Gulf countries. Construction industry projects were planned and managed by different types of experts, who came from all over the world with different kinds of experience in construction management and industry. Some of these projects were completed on time, while many were not, due to many accumulating factors. These accumulated factors are considered the principal reason for the problems experienced at the construction stage, which reflected negatively on project success. Specific causes of delay have been identified by construction managers so that unexpected delays can be avoided through proper analysis and consideration of implications such as risk assessment and analysis of potential problems, to ensure that projects are delivered on time. Construction management implications were adopted and considered by project managers who have experience and knowledge in applying the techniques of engineering construction management. The aim of this research is to determine the benefits of the implications of construction management by the construction team and the level of consideration of these techniques and processes during the project development and construction phases to avoid any delay in the projects. It also aims to determine the factors that contribute to project completion delays in case project managers are not well committed to their roles and responsibilities. The results of the analysis will determine the applications required by the project team to avoid the causes of delays and help them deliver projects on time, e.g. verifying tender documents, quantities and preparing the construction method of the project.
Keywords: construction management, control process, cost control, planning and scheduling
Procedia PDF Downloads 247
16029 A FE-Based Scheme for Computing Wave Interaction with Nonlinear Damage and Generation of Harmonics in Layered Composite Structures
Authors: R. K. Apalowo, D. Chronopoulos
Abstract:
A Finite Element (FE) based scheme is presented for quantifying guided wave interaction with Localised Nonlinear Structural Damage (LNSD) within structures of arbitrary layering and geometric complexity. The through-thickness mode-shape of the structure is obtained through a wave and finite element method. This is applied in a time domain FE simulation in order to generate time harmonic excitation for a specific wave mode. Interaction of the wave with LNSD within the system is computed through an element activation and deactivation iteration. The scheme is validated against experimental measurements and a WFE-FE methodology for calculating wave interaction with damage. Case studies for guided wave interaction with crack and delamination are presented to verify the robustness of the proposed method in classifying and identifying damage.
Keywords: layered structures, nonlinear ultrasound, wave interaction with nonlinear damage, wave finite element, finite element
Procedia PDF Downloads 163
16028 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy
Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu
Abstract:
The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on the reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine the reliability and, in turn, to understand the improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine whether the data exhibit an infant stage or have transitioned into the operational phase. The shape parameter beta plays a crucial role in identifying this stage. Additionally, determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, since it can significantly increase the total failure rate. To address this, an approach utilizing the well-established statistical Laplace test is applied to infer the behavior of the sensors and to accurately ascertain the duration of the different phases of the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data and whether the thresholds for the infant period and wear-out phase are accurately estimated, by validating the data in the individual phases with Weibull distribution curve fitting. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the commencement of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications.
Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis
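A minimal sketch of the two statistical ingredients, the Laplace trend test and the Weibull fit, using hypothetical failure times rather than the mission data:

```python
import numpy as np
from scipy import stats

# Hypothetical failure times (days) at which a sensor exceeded its
# location-accuracy spec within an observation window T; values are
# illustrative only, not field data.
failure_times = np.array([5.0, 9.0, 14.0, 30.0, 55.0, 90.0, 140.0, 200.0])
T = 250.0  # end of the observation window

# Laplace trend test: U < 0 suggests a decreasing failure rate (infant
# mortality), U > 0 an increasing rate (wear-out), U near 0 a stable phase.
n = len(failure_times)
U = (failure_times.mean() / T - 0.5) * np.sqrt(12 * n)

# Weibull fit with the location fixed at 0: shape beta < 1 also indicates
# infant mortality, beta > 1 wear-out.
beta, loc, eta = stats.weibull_min.fit(failure_times, floc=0)
print(round(U, 2), round(beta, 2))
```

Here the early-clustered failures give a clearly negative U, so these points would be assigned to the infant phase and excluded before the operational-phase reliability fit.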
Procedia PDF Downloads 65
16027 FESA: Fuzzy-Controlled Energy-Efficient Selective Allocation and Reallocation of Tasks Among Mobile Robots
Authors: Anuradha Banerjee
Abstract:
Energy-aware operation is one of the visionary goals in the area of robotics because the operability of robots is greatly dependent upon their residual energy. In practice, the tasks allocated to robots carry different priorities, and often an upper time limit is imposed within which the task needs to be completed. If a robot is unable to complete one particular task given to it, the task is reallocated to some other robot. The collection of robots is controlled by a Central Monitoring Unit (CMU). Selection of the new robot is performed by a fuzzy controller called the Task Reallocator (TRAC). It accepts parameters such as the residual energy of the robots, the possibility that the task will be successfully completed by the new robot within the stipulated time, and the distance of the new robot (to which the task is reallocated) from the old one (where the task was going on). The proposed methodology increases the probability of completing globally assigned tasks and saves a huge amount of energy as far as the collection of robots is concerned.
Keywords: energy-efficiency, fuzzy-controller, priority, reallocation, task
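A toy version of the reallocation decision might look like the following; the membership functions, weights, and candidate values are invented for illustration and are not the TRAC rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def reallocation_score(residual_energy, success_prob, distance_km):
    # Hypothetical rule base: favour high residual energy, high completion
    # probability, and short travel distance; weights are illustrative.
    high_energy = tri(residual_energy, 0.3, 1.0, 1.7)  # energy as fraction of full
    likely_done = tri(success_prob, 0.4, 1.0, 1.6)
    nearby = tri(distance_km, -2.0, 0.0, 2.0)
    return 0.4 * high_energy + 0.4 * likely_done + 0.2 * nearby

# The CMU picks the candidate robot with the highest score.
candidates = {
    "robot_A": reallocation_score(0.9, 0.8, 0.5),
    "robot_B": reallocation_score(0.5, 0.9, 1.5),
    "robot_C": reallocation_score(0.95, 0.95, 0.2),
}
best = max(candidates, key=candidates.get)
print(best)
```

A full fuzzy controller would also defuzzify over a rule table rather than take a weighted sum, but the ranking idea is the same.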
Procedia PDF Downloads 316
16026 Towards Positive Identity Construction for Japanese Non-Native English Language Teachers
Authors: Yumi Okano
Abstract:
The low level of English proficiency among Japanese people has been a problem for a long time. Japanese non-native English language teachers, under social or ideological constraints, feel a gap between government policy and their language proficiency and cannot maintain high self-esteem. This paper focuses on current Japanese policies and the social context in which teachers are placed and examines the measures necessary for their positive identity formation from a macro-meso-micro perspective. Some suggestions for achieving this are: 1) teachers should free themselves from the idea of the native speaker and embrace local needs and accents; 2) teachers should be involved in student discussions as facilitators and individuals so that they can be good role models for their students; 3) teachers should invest in their classrooms; 4) guidelines and training should be provided to help teachers gain confidence, in addition to reducing the workload to make more time available; and 5) opportunities for investment outside the classroom into the real world should be expanded.
Keywords: language teacher identity, native speakers, government policy, critical pedagogy, investment
Procedia PDF Downloads 103
16025 Core Number Optimization Based Scheduler to Order/Map Simulink Application
Authors: Asma Rebaya, Imen Amari, Kaouther Gasmi, Salem Hasnaoui
Abstract:
In recent years, the number of cores has increased spectacularly in digital signal and general-purpose processors. Concurrently, significant research has been done to benefit from this high degree of parallelism. Indeed, this research is focused on providing efficient scheduling of hardware/software systems onto multicore architectures. The scheduling process consists of statically choosing one core to execute each task and specifying an execution order for the application tasks. In this paper, we describe an efficient scheduler that calculates the optimal number of cores required to schedule an application, gives a heuristic scheduling solution and evaluates its cost. Our proposal is evaluated and compared with the Preesm scheduler, and we show that ours allows better scheduling in terms of latency, computation time and number of cores.
Keywords: computation time, hardware/software system, latency, optimization, multi-core platform, scheduling
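A greatly simplified sketch of the core-count search, using greedy list scheduling on independent tasks (the real scheduler must also respect data dependencies between Simulink actors); the task durations are invented:

```python
import heapq

def schedule(tasks, n_cores):
    """Greedy list scheduling: assign each task to the earliest-free core.

    tasks: list of execution times, already in a precedence-respecting order.
    Returns the makespan (overall latency).
    """
    cores = [0.0] * n_cores          # next-free time of each core
    heapq.heapify(cores)
    for duration in tasks:
        free_at = heapq.heappop(cores)
        heapq.heappush(cores, free_at + duration)
    return max(cores)

# Hypothetical actor execution times (ms), independent for simplicity.
tasks = [4, 3, 3, 2, 2, 1, 1]

# Sweep the core count and keep the smallest count whose makespan stops
# improving, mirroring the "optimal number of cores" search described above.
best = {n: schedule(tasks, n) for n in range(1, 6)}
optimal = min(n for n in best if best[n] == min(best.values()))
print(best, optimal)
```

Adding a fifth core does not reduce the makespan here, so four cores is the smallest count achieving the minimum latency.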
Procedia PDF Downloads 284
16024 Effectiveness of Multi-Business Core Development Policy in Tokyo Metropolitan Area
Authors: Takashi Nakamura
Abstract:
In the Tokyo metropolitan area, traffic congestion and long commute times are caused by overconcentration in the central area. To resolve these problems, a core business city development policy was adopted in 1988. The core business cities, which include Yokohama, Chiba, Saitama, Tachikawa, and others, have designated business facilities accumulation districts where assistance measures are applied. Focusing on Yokohama city, this study investigates the trends in the number of offices, employees, and commuters using the 2001 and 2012 Economic Censuses, as well as the average commute time in the Tokyo metropolitan area from the 2005 to 2015 Metropolitan Transportation Censuses. Workers in Yokohama were surveyed in the 2001 and 2012 Economic Censuses according to their distribution in the city's 1,757 subregions. Four main findings emerged: (1) the number of offices in Yokohama increased while the number of offices in the Tokyo metropolitan area overall decreased, and the number of employees in Yokohama increased; (2) the number of commuters to Tokyo's central area increased from Saitama prefecture, the Tokyo Tama area, and the Tokyo central area, but decreased from other areas; (3) the average commute time in the Tokyo metropolitan area was 67.7 minutes in 2015, a slight decrease from 2005 and 2010; (4) the number of employees at business facilities accumulation districts in Yokohama city increased greatly.
Keywords: core business city development policy, commute time, number of employees, Yokohama city, distribution of employees
Procedia PDF Downloads 143
16023 Comparison of Different DNA Extraction Platforms with FFPE Tissue
Authors: Wang Yanping Karen, Mohd Rafeah Siti, Park MI Kyoung
Abstract:
Formalin-fixed paraffin-embedded (FFPE) tissue is important in the area of oncological diagnostics. This method of preserving tissues enables them to be stored easily at ambient temperature for a long time. This decreases the risk of losing DNA quantity and quality after extraction, reduces sample wastage, and makes FFPE more cost-effective. However, extracting DNA from FFPE tissue is a challenge, as the purified DNA is often highly cross-linked, fragmented, and degraded, which causes problems for many downstream processes. In this study, the DNA extraction efficiency of One BioMed's Xceler8 automated platform is compared with commercially available extraction kits (Qiagen and Roche). The FFPE tissue slices were subjected to deparaffinization, pretreatment and then DNA extraction using the three platforms mentioned. The DNA quantity was determined with real-time PCR (Bio-Rad CFX) and gel electrophoresis. The amount of DNA extracted with One BioMed's X8 platform was found to be comparable with that of the other two manual extraction kits.
Keywords: DNA extraction, FFPE tissue, Qiagen, Roche, One BioMed X8
Procedia PDF Downloads 107
16022 Efficiency of Treatment in Patients with Newly Diagnosed Destructive Pulmonary Tuberculosis Using Intravenous Chemotherapy
Authors: M. Kuzhko, M. Gumeniuk, D. Butov, T. Tlustova, O. Denysov, T. Sprynsian
Abstract:
Background: The aim of the research was to determine the effectiveness of chemotherapy using intravenous antituberculosis drugs compared with their oral administration during the intensive phase of treatment. Methods: 152 tuberculosis patients were randomized into 2 groups: a main group (n=65), who received isoniazid, ethambutol and sodium rifamycin intravenously + pyrazinamide per os, and a control group (n=87), who received all the drugs (isoniazid, rifampicin, ethambutol, pyrazinamide) orally. Results: After 2 weeks of treatment, symptoms of intoxication disappeared in 59 (90.7±3.59%) of the patients in the main group and 60 (68.9±4.9%) of the patients in the control group, p < 0.05. The mean duration of symptoms of intoxication was 9.6±0.7 days in the main group and 13.7±0.9 days in the control group. After completing the intensive phase, sputum conversion was found in all the patients in the main group and in 71 (81.6±4.1%) patients in the control group, p < 0.05. The average time to sputum conversion was 1.6±0.1 months in the main group and 1.9±0.1 months in the control group, p > 0.05. In patients with destructive pulmonary tuberculosis, time to sputum conversion was 1.7±0.1 months in the main group and 2.2±0.2 months in the control group, p < 0.05. The average time of cavity healing was 2.9±0.2 months in the main group and 3.9±0.2 months in the control group, p < 0.05. Conclusions: In patients with newly diagnosed destructive pulmonary tuberculosis, the use of intravenous isoniazid, ethambutol and sodium rifamycin in the intensive phase of chemotherapy resulted in a significant reduction in the time to disappearance of symptoms of intoxication and to sputum conversion.
Keywords: intravenous chemotherapy, tuberculosis, treatment efficiency, tuberculosis drugs
Procedia PDF Downloads 202
16021 Accurate Calculation of the Penetration Depth of a Bullet Using ANSYS
Authors: Eunsu Jang, Kang Park
Abstract:
In developing an armored ground combat vehicle (AGCV), analyzing the vulnerability (or survivability) of the AGCV against an enemy's attack is a very important step. In vulnerability analysis, penetration equations are usually used to obtain the penetration depth and check whether a bullet can penetrate the armor of the AGCV, which would damage internal components or crew. The penetration equations are derived from penetration experiments, which require a long time and great effort. However, they usually hold only for the specific target material and the specific type of bullet used in the experiments. Thus, penetration simulation using ANSYS can be another option for calculating penetration depth. However, it is very important to model the targets and select the input parameters appropriately in order to get an accurate penetration depth. This paper performed a sensitivity analysis of the ANSYS input parameters on the accuracy of the calculated penetration depth. Two conflicting objectives need to be achieved in adopting ANSYS for penetration analysis: maximizing the accuracy of the calculation and minimizing the calculation time. To maximize the calculation accuracy, a sensitivity analysis of the input parameters for ANSYS was performed and the RMS error with respect to the experimental data was calculated. The input parameters, which include mesh size, boundary condition, material properties and target diameter, are tested and selected to minimize the error between the calculated result from simulation and the experimental data from papers on the penetration equation. To minimize the calculation time, the parameter values obtained from the accuracy analysis are adjusted to optimize overall performance. As a result of the analysis, the following was found: 1) as the mesh size gradually decreases from 0.9 mm to 0.5 mm, both the penetration depth and calculation time increase; 2) as the diameter of the target decreases from 250 mm to 60 mm, both the penetration depth and calculation time decrease; 3) as the yield stress, one of the material properties of the target, decreases, the penetration depth increases; 4) the boundary condition with only the side surface of the target fixed gives a greater penetration depth than that with both the side and rear surfaces fixed. Using the above findings, the input parameters can be tuned to minimize the error between simulation and experiments. By using the simulation tool ANSYS with delicately tuned input parameters, penetration analysis can be done on a computer without actual experiments. The data from penetration experiments are usually hard to obtain for security reasons, and published papers provide them only for a limited range of target materials. The next step of this research is to generalize this approach to anticipate the penetration depth by interpolating the known penetration experiments. This result may not be accurate enough to replace penetration experiments, but such simulations can be used in the early modelling and simulation stage of the AGCV design process.
Keywords: ANSYS, input parameters, penetration depth, sensitivity analysis
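The error metric used to rank candidate parameter sets can be stated compactly; the depth values below are placeholders, not the paper's data:

```python
import math

# Hypothetical penetration depths (mm): experimental values from the
# literature versus ANSYS results for one candidate parameter set.
experiment = [12.5, 18.0, 25.3, 31.0]
simulation = [13.1, 17.2, 26.0, 29.8]

# RMS error used to rank parameter sets during the sensitivity analysis;
# the set minimizing this value wins before the runtime tuning step.
rms = math.sqrt(sum((s - e) ** 2 for s, e in zip(simulation, experiment)) / len(experiment))
print(round(rms, 3))
```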
Procedia PDF Downloads 401
16020 Thermoelectric Properties of Doped Polycrystalline Silicon Film
Authors: Li Long, Thomas Ortlepp
Abstract:
The transport properties of carriers in polycrystalline silicon film affect the performance of polycrystalline silicon-based devices. They depend strongly on the grain structure, grain boundary trap properties and doping concentration, which in turn are determined by the film deposition and processing conditions. Based on the properties of charge carriers, phonons, grain boundaries and their interactions, the thermoelectric properties of polycrystalline silicon are analyzed with the relaxation time approximation of the Boltzmann transport equation. With this approach, the thermal conductivity, electrical conductivity and Seebeck coefficient can be determined as functions of grain size, trap properties and doping concentration. Experiments on heavily doped polycrystalline silicon are carried out, and the measurement results are compared with the model.
Keywords: conductivity, polycrystalline silicon, relaxation time approximation, Seebeck coefficient, thermoelectric property
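For orientation, the simplest output of the relaxation time approximation is the Drude-type electrical conductivity σ = n e² τ / m*; the carrier density, relaxation time, and effective mass below are assumed illustrative values, not fitted to polysilicon measurements:

```python
# Drude-type estimate of electrical conductivity, sigma = n * e^2 * tau / m*,
# the simplest limit of the relaxation time approximation used above.
E_CHARGE = 1.602176634e-19         # C
M_EFF = 0.26 * 9.1093837015e-31    # kg, assumed conductivity effective mass in Si

def drude_conductivity(n_carriers_m3, tau_s):
    return n_carriers_m3 * E_CHARGE**2 * tau_s / M_EFF

# Assumed values: heavy doping (1e25 m^-3) and a 10 fs relaxation time.
sigma = drude_conductivity(n_carriers_m3=1e25, tau_s=1e-14)
print(f"{sigma:.0f} S/m")
```

In the full model, τ itself depends on grain-boundary scattering and trap occupancy, which is where the grain-size dependence enters.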
Procedia PDF Downloads 124
16019 Detection of Voltage Sag and Voltage Swell in Power Quality Using Wavelet Transforms
Authors: Nor Asrina Binti Ramlee
Abstract:
Voltage sag, voltage swell, high-frequency noise and voltage transients are kinds of disturbances in power quality. They are also known as power quality events. Equipment used in industry nowadays has become more sensitive to these events as its complexity increases. This makes distributing clean power to the consumer important, and to provide better service, reliable power quality analysis is vital. Thus, this paper presents the detection of events, focusing on voltage sag and swell. The method is developed by applying time domain signal analysis using a wavelet transform approach in MATLAB. Four types of mother wavelet, namely Haar, Dmey, Daubechies, and Symlet, are used to detect the events. This project analyzed real interrupted signals obtained from a 22 kV transmission line in Skudai, Johor Bahru, Malaysia. The signals are decomposed using the mother wavelets. The best mother wavelet is the one that is capable of detecting the time location of the event accurately.
Keywords: power quality, voltage sag, voltage swell, wavelet transform
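The detection idea can be demonstrated on a synthetic sag with a hand-rolled level-1 Haar detail (an undecimated high-pass filter, written out directly to avoid depending on a wavelet toolbox); the sampling rate, sag depth, and threshold rule are all assumptions:

```python
import numpy as np

fs, f0 = 6400, 50                        # Hz; assumed sampling rate and fundamental
n = np.arange(int(0.2 * fs))             # 0.2 s of samples
t = n / fs

# Synthetic 50 Hz waveform with a 40% sag from sample 528 to 911 (82.5-142.5 ms).
amplitude = np.where((n >= 528) & (n < 912), 0.6, 1.0)
v = amplitude * np.sin(2 * np.pi * f0 * t)

# Undecimated level-1 Haar detail (high-pass [1, -1] / sqrt(2)): the smooth
# sinusoid yields small coefficients, abrupt amplitude changes yield spikes.
d = np.abs(np.convolve(v, [1, -1], mode="same")) / np.sqrt(2)

# Flag samples whose detail magnitude exceeds 3x the median detail level.
events = np.flatnonzero(d > 3 * np.median(d))
print(t[events] * 1000)   # event times in ms: sag onset and recovery
```

The two flagged samples locate the sag onset and recovery; the paper's comparison asks which mother wavelet localizes these instants best on real signals.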
Procedia PDF Downloads 372
16018 A Portable Device for Pulse Wave Velocity Measurements
Authors: Chien-Lin Wang, Cha-Ling Ko, Tainsong Chen
Abstract:
The pulse wave velocity (PWV) of blood flow provides important information on vessel properties and blood pressure, which can be used to assess cardiovascular disease. However, such measurements typically require expensive equipment, such as Doppler ultrasound, MRI, angiography, etc. Photoplethysmograph (PPG) signals are commonly utilized to detect blood volume changes. In this study, two infrared (IR) probes are designed and placed a fixed distance apart, at the finger base and fingertip. An analog circuit with automatic gain adjustment is implemented to obtain stable raw PPG signals from the two IR probes. In order to obtain the time delay between the two PPG signals precisely, the pulse transit time is derived from the second derivative of the original PPG signals. To obtain a portable, wireless and low-power-consumption PWV measurement device, Bluetooth Low Energy 4.0 (BLE) and a microprocessor (Cortex™-M3) are used in this study. The PWV is highly correlated with blood pressure, so this portable device has potential for continuous blood pressure monitoring.
Keywords: pulse wave velocity, photoplethysmography, portable device, biomedical engineering
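The timing principle reduces to locating a fiducial point on each PPG waveform via the second derivative and dividing the probe separation by the delay; the pulse shapes, sampling rate, and 12 cm separation below are synthetic stand-ins:

```python
import numpy as np

fs = 1000                                   # Hz, assumed PPG sampling rate
t = np.arange(0, 1.0, 1 / fs)

def ppg_pulse(onset):
    """Toy PPG pulse: a Gaussian bump peaking 100 ms after `onset` (s)."""
    return np.exp(-((t - onset - 0.10) ** 2) / (2 * 0.03 ** 2))

base = ppg_pulse(0.20)                      # probe at the finger base
tip = ppg_pulse(0.24)                       # fingertip probe, pulse 40 ms later

def fiducial(sig):
    """Index of the second-derivative maximum on the rising edge of the pulse."""
    peak = int(np.argmax(sig))
    accel = np.gradient(np.gradient(sig))
    return int(np.argmax(accel[:peak]))

ptt = (fiducial(tip) - fiducial(base)) / fs  # pulse transit time (s)
pwv = 0.12 / ptt                             # assumed 12 cm probe separation
print(ptt, pwv)
```

The second derivative sharpens the pulse foot, which is why it gives a more repeatable timing mark than the broad systolic peak itself.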
Procedia PDF Downloads 527
16017 Analysing Environmental Licensing of Infrastructure Projects in Brazil
Authors: Ronaldo Seroa Da Motta, Gabriela Santiago
Abstract:
The main contribution of this study is the identification of the factors influencing the environmental licensing process of infrastructure projects in Brazil. These factors reflect the technical characteristics of the project, the corporate governance of the entrepreneur, and the institutional and regulatory governance of the environmental agency, including the number of interventions by non-licensing agencies. The model relates these variables to the licensing processing time of 34 infrastructure projects. Our results indicated that processing time is most sensitive to the type of enterprise, to complexity (as in gas pipelines and hydroelectric plants in the most vulnerable biome), to the value of the enterprise or the entrepreneur's assets, and to the number of employees of the licensing agency. The number of external interventions by other non-licensing institutions does not affect the licensing time. These results challenge the common criticism that environmental licensing is a barrier to speeding up investment in infrastructure projects in Brazil due to the participation of civil society and other non-licensing institutions.
Keywords: environmental licensing, conditionants, Brazil, timing process
Procedia PDF Downloads 135
16016 Performance Analysis of the Time-Based and Periodogram-Based Energy Detector for Spectrum Sensing
Authors: Sadaf Nawaz, Adnan Ahmed Khan, Asad Mahmood, Chaudhary Farrukh Javed
Abstract:
Classically, an energy detector is implemented in the time domain (TD). However, a frequency domain (FD) based energy detector has demonstrated improved performance. This paper presents a comparison between the two approaches to analyze their pros and cons. A detailed performance analysis of the classical TD energy detector and the periodogram-based detector is performed. Exact and approximate mathematical expressions for the probability of false alarm (Pf) and the probability of detection (Pd) are derived for both approaches. The derived expressions naturally lead to analytical as well as intuitive reasoning for the improved performance of Pf and Pd in different scenarios. Our analysis shows that the improvement depends on the buffer sizes: Pf is improved in FD, whereas Pd is enhanced in TD based energy detectors. Finally, Monte Carlo simulation results corroborate the analysis based on the derived expressions.
Keywords: cognitive radio, energy detector, periodogram, spectrum sensing
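The TD false-alarm calibration can be checked numerically in a few lines; the buffer size and target Pf are arbitrary choices for the demonstration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

N = 64                     # samples per sensing buffer
n_trials = 200_000
target_pf = 0.05

# Noise-only hypothesis: real Gaussian noise with unit variance. The TD energy
# statistic sum(x^2) is then chi-square with N degrees of freedom, so the
# threshold for a target false-alarm probability comes from the chi-square tail.
threshold = stats.chi2.ppf(1 - target_pf, df=N)

x = rng.standard_normal((n_trials, N))
energy = (x ** 2).sum(axis=1)
empirical_pf = (energy > threshold).mean()
print(empirical_pf)
```

The empirical false-alarm rate matches the analytical 5% closely, which is exactly the kind of Monte Carlo cross-check the abstract describes.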
Procedia PDF Downloads 378
16015 Networked Radar System to Increase Safety of Urban Railroad Crossing
Authors: Sergio Saponara, Luca Fanucci, Riccardo Cassettari, Ruggero Piernicola, Marco Righetto
Abstract:
The paper presents an innovative networked radar system for the detection of obstacles in a railway level crossing scenario. This Monitoring System (MS) is able to detect moving or still obstacles within the railway level crossing area automatically, avoiding the need for human surveillance. The MS is also connected to the National Railway Information and Signaling System to communicate the level crossing status in real time. The architecture is compliant with the highest Safety Integrity Level (SIL4) of the CENELEC standard. The number of radar sensors used is configurable at set-up time and depends on how large the level crossing area is. At least two sensors are expected, and up to four can be used for larger areas. The whole processing chain that processes the sensor output signals, as well as the communication interface, is fully digital, was designed in VHDL code and was implemented on a Xilinx Virtex 6.
Keywords: radar for safe mobility, railroad crossing, railway, transport safety
Procedia PDF Downloads 480
16014 Parking Space Detection and Trajectory Tracking Control for Vehicle Auto-Parking
Authors: Shiuh-Jer Huang, Yu-Sheng Hsu
Abstract:
An on-board parking space detection system, parking trajectory planning and a tracking control mechanism are the key components of a vehicle backward auto-parking system. Firstly, a pair of ultrasonic sensors is installed on each side of the vehicle body surface to detect the relative distance between the ego-car and surrounding obstacles. The dimensions of a found empty space can be calculated based on the vehicle speed and the time history of the ultrasonic sensor detection information. This result can be used for constructing a 2D map of the vehicle environment and judging the available parking type. Finally, the auto-parking controller executes on-line optimal parking trajectory planning based on this 2D environmental map and monitors the real-time vehicle parking trajectory tracking control. This low-cost auto-parking system was tested on a model car.
Keywords: vehicle auto-parking, parking space detection, parking path tracking control, intelligent fuzzy controller
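The space-dimension step reduces to speed multiplied by the duration of the obstacle-free gap; the safety margin and the 5 m threshold below are assumed values, not the authors' calibration:

```python
def parking_space_length(speed_mps, gap_start_s, gap_end_s, margin_m=0.3):
    """Length of a detected empty space from ultrasonic 'no obstacle' timing.

    While the ego-car drives past at `speed_mps`, the side ultrasonic sensor
    reports no obstacle between `gap_start_s` and `gap_end_s`; the swept
    distance is the candidate space, shrunk by a safety margin at each end.
    """
    return speed_mps * (gap_end_s - gap_start_s) - 2 * margin_m

# Example: passing at 5 km/h (~1.39 m/s) with a 4.5 s obstacle-free interval.
length = parking_space_length(5 / 3.6, gap_start_s=2.0, gap_end_s=6.5)
min_parallel = 5.0   # m, assumed minimum length for a parallel manoeuvre
print(round(length, 2), length >= min_parallel)
```

The same gap timing, combined with the measured lateral distances, fills in the 2D environmental map used by the trajectory planner.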
Procedia PDF Downloads 245
16013 A High Performance Piano Note Recognition Scheme via Precise Onset Detection and Segmented Short-Time Fourier Transform
Authors: Sonali Banrjee, Swarup Kumar Mitra, Aritra Acharyya
Abstract:
A piano note recognition method is proposed by the authors in this paper. The authors have used a comprehensive method for onset detection of each note present in a piano piece, followed by a segmented short-time Fourier transform (STFT) for the identification of the piano notes. The performance evaluation of the proposed method has been carried out in harsh noisy environments by adding different levels of additive white Gaussian noise (AWGN), with different signal-to-noise ratios (SNR), to the original signal and evaluating the note detection error rate (NDER) of piano pieces consisting of different numbers of notes at different SNR levels. The NDER is found to remain within 15% for all piano pieces under consideration when the SNR is kept above 8 dB.
Keywords: AWGN, onset detection, piano note, STFT
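Given an onset-segmented note, the STFT identification stage amounts to locating the spectral peak and mapping it to the nearest equal-tempered pitch; the synthetic 440 Hz tone below stands in for a real piano segment:

```python
import numpy as np

fs = 8000
t = np.arange(0, 0.5, 1 / fs)
tone = np.sin(2 * np.pi * 440.0 * t)         # synthetic A4 "note" segment

def note_from_segment(segment, fs):
    """Identify the MIDI note of one onset-to-onset segment via the FFT peak."""
    window = np.hanning(len(segment))
    spectrum = np.abs(np.fft.rfft(segment * window))
    freqs = np.fft.rfftfreq(len(segment), 1 / fs)
    f_peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
    return int(round(69 + 12 * np.log2(f_peak / 440.0)))

print(note_from_segment(tone, fs))   # prints 69, the MIDI number of A4
```

Real piano notes carry strong harmonics, so a production pitch detector would weight the fundamental against overtones rather than take the single largest bin.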
Procedia PDF Downloads 160
16012 Portable Cardiac Monitoring System Based on Real-Time Microcontroller and Multiple Communication Interfaces
Authors: Ionel Zagan, Vasile Gheorghita Gaitan, Adrian Brezulianu
Abstract:
This paper presents the contributions made in designing a mobile system named Tele-ECG, implemented for the remote monitoring of cardiac patients. For better flexibility of this application, the authors chose to implement a local memory and multiple communication interfaces. The project described in this presentation is based on the ARM Cortex M0+ microcontroller and the dedicated ADAS1000 chip necessary for the collection and transmission of electrocardiogram (ECG) signals from the patient to the microcontroller, without altering the performance and stability of the system. The novelty brought by this paper is the implementation of a remote monitoring system for cardiac patients with real-time behavior and multiple interfaces. The microcontroller is responsible for processing the digital signals corresponding to the ECG and also for implementing the communication interface with the main server, using a GSM/Bluetooth SIMCOM SIM800C module. This paper covers all the characteristics of the Tele-ECG project, representing a feasible implementation in the biomedical field. Acknowledgment: This paper was supported by the project 'Development and integration of a mobile tele-electrocardiograph in the GreenCARDIO© system for patients monitoring and diagnosis - m-GreenCARDIO', Contract no. BG58/30.09.2016, PNCDI III, Bridge Grant 2016, using the infrastructure from the project 'Integrated Center for research, development and innovation in Advanced Materials, Nanotechnologies, and Distributed Systems for fabrication and control', Contract No. 671/09.04.2015, Sectoral Operational Program for Increase of the Economic Competitiveness co-funded from the European Regional Development Fund.
Keywords: Tele-ECG, real-time cardiac monitoring, electrocardiogram, microcontroller
Procedia PDF Downloads 272
16011 Disentangling the Sources and Context of Daily Work Stress: Study Protocol of a Comprehensive Real-Time Modelling Study Using Portable Devices
Authors: Larissa Bolliger, Junoš Lukan, Mitja Lustrek, Dirk De Bacquer, Els Clays
Abstract:
Introduction and Aim: Chronic workplace stress and its health-related consequences, such as mental and cardiovascular diseases, have been widely investigated. This project focuses on the sources and context of psychosocial daily workplace stress in a real-world setting. The main objective is to analyze and model real-time relationships between (1) psychosocial stress experiences within the natural work environment, (2) micro-level work activities and events, and (3) physiological signals and behaviors in office workers. Methods: An Ecological Momentary Assessment (EMA) protocol has been developed, partly building on machine learning techniques. Empatica® wristbands will be used for real-life detection of stress from physiological signals; micro-level activities and events at work will be based on smartphone registrations, further processed by an automated computer algorithm. A field study including 100 office-based workers with high-level problem-solving tasks, such as managers and researchers, will be implemented in Slovenia and Belgium (50 in each country). Data mining and state-of-the-art statistical methods – mainly multilevel statistical modelling for repeated data – will be used. Expected Results and Impact: The project findings will provide novel contributions to the field of occupational health research. While traditional assessments provide information about a globally perceived state of chronic stress exposure, the EMA approach is expected to bring new insights into daily fluctuating work stress experiences, especially the micro-level events and activities at work that induce acute physiological stress responses. The project is therefore likely to generate further evidence on relevant stressors in a real-time working environment and hence make it possible to advise on workplace procedures and policies for reducing stress.
Keywords: ecological momentary assessment, real-time, stress, work
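A minimal sketch of the kind of multilevel (random-intercept) model for repeated EMA data that the protocol describes, fitted here on synthetic data; the variable names, effect sizes, and the use of statsmodels are illustrative assumptions, not part of the study protocol.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_subjects, n_obs = 30, 20  # repeated EMA prompts per worker

# Synthetic data: momentary stress as a function of a micro-level event
# indicator (e.g. "interruption occurred"), with subject-level random
# intercepts capturing between-person differences.
subject = np.repeat(np.arange(n_subjects), n_obs)
event = rng.integers(0, 2, n_subjects * n_obs)
intercepts = rng.normal(0, 1, n_subjects)[subject]
stress = 3.0 + 0.8 * event + intercepts + rng.normal(0, 0.5, n_subjects * n_obs)

df = pd.DataFrame({"subject": subject, "event": event, "stress": stress})

# Random-intercept model: stress ~ event, observations grouped by subject
model = smf.mixedlm("stress ~ event", df, groups=df["subject"]).fit()
effect = model.params["event"]  # estimated within-person event effect
```

Grouping by subject is what distinguishes this from ordinary regression: repeated prompts from the same worker are correlated, and the random intercept absorbs that correlation so the event effect is estimated within persons.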
Procedia PDF Downloads 161
16010 Evaluation of Mixing and Oxygen Transfer Performances for a Stirred Bioreactor Containing P. chrysogenum Broths
Authors: A. C. Blaga, A. Cârlescu, M. Turnea, A. I. Galaction, D. Caşcaval
Abstract:
The performance of an aerobic stirred bioreactor for fungal fermentation was analyzed on the basis of mixing time and the oxygen mass transfer coefficient, by quantifying the influence of specific geometrical and operational parameters of the bioreactor, as well as the rheological behavior of Penicillium chrysogenum broths (free mycelia and mycelial aggregates). The rheological properties of the fungal broth, controlled by the biomass concentration, its growth rate, and its morphology, strongly affect the performance of the bioreactor. Experimental data showed that, for both morphological structures, the accumulation of fungal biomass induces a significant increase in broth viscosity and modifies the rheological behavior. For lower P. chrysogenum concentrations (both morphological conformations), the mixing time initially increases with aeration rate, reaches a maximum value, and then decreases. This variation can be explained by the formation of small bubbles, due to the presence of the solid phase, which hinders bubble coalescence, the rising velocity of the bubbles being reduced by the high apparent viscosity of the fungal broths. With biomass accumulation, the variation of mixing time with aeration rate gradually changes, a continuous reduction of mixing time with increasing air input flow being obtained for 33.5 g/l d.w. P. chrysogenum. Owing to the higher apparent viscosity, which considerably reduces the relative contribution of mechanical agitation to broth mixing, these phenomena are more pronounced for P. chrysogenum free mycelia. Due to the increase in broth apparent viscosity, biomass accumulation induces two significant effects on the oxygen transfer rate: the diminution of turbulence and the perturbation of the bubble dispersion-coalescence equilibrium. The increase of P. chrysogenum free mycelia concentration leads to a decrease in kLa values.
Thus, for the considered variation domain of the main parameters, namely air superficial velocity from 8.36×10⁻⁴ to 5.02×10⁻³ m/s and specific power input from 100 to 500 W/m³, kLa was reduced by a factor of 3.7 for a biomass concentration increase from 4 to 36.5 g/l d.w. The broth containing P. chrysogenum mycelial aggregates exhibits a particular behavior from the point of view of oxygen transfer. Regardless of the bioreactor operating conditions, the increase of biomass concentration initially leads to an increase of the oxygen mass transfer rate, a phenomenon that can be explained by the interaction of pellets with bubbles. The results relate to the increase of the apparent viscosity of the broths corresponding to the variation of biomass concentration between the mentioned limits. Thus, for a CX increase from 4 to 36.5 g/l d.w., the apparent viscosity of the suspension increased by a factor of 44.2 for fungal mycelial aggregates and by a factor of 63.9 for fungal free mycelia. By means of the experimental data, mathematical correlations describing the influences of the considered factors on mixing time and kLa have been proposed. The proposed correlations can be used in bioreactor performance evaluation, optimization, and scale-up.
Keywords: biomass concentration, mixing time, oxygen mass transfer, P. chrysogenum broth, stirred bioreactor
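The proposed correlations themselves are not reproduced in the abstract. As a hedged illustration, kLa in stirred aerated vessels is commonly correlated with specific power input, superficial gas velocity, and apparent viscosity in a power-law form, kLa = α (P/V)ᵃ vₛᵇ μ꜀, whose exponents can be recovered by log-linear least squares. All numerical values below are synthetic, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic measurements spanning the operating ranges quoted above:
# specific power input 100-500 W/m3, superficial velocity ~1e-3 m/s,
# apparent viscosity rising with biomass (all values illustrative).
PV = rng.uniform(100, 500, 50)            # W/m3
vs = rng.uniform(8.36e-4, 5.02e-3, 50)    # m/s
mu = rng.uniform(1e-3, 6e-2, 50)          # Pa*s

true = dict(alpha=2e-3, a=0.5, b=0.4, c=-0.6)  # assumed "true" exponents
kla = true["alpha"] * PV**true["a"] * vs**true["b"] * mu**true["c"]

# Log-linearize: ln(kLa) = ln(alpha) + a*ln(P/V) + b*ln(vs) + c*ln(mu)
X = np.column_stack([np.ones_like(PV), np.log(PV), np.log(vs), np.log(mu)])
coef, *_ = np.linalg.lstsq(X, np.log(kla), rcond=None)
alpha_hat, a_hat, b_hat, c_hat = np.exp(coef[0]), coef[1], coef[2], coef[3]
```

The negative viscosity exponent encodes the abstract's central observation: as apparent viscosity rises with biomass, kLa falls.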
Procedia PDF Downloads 340
16009 A Physiological Approach for Early Detection of Hemorrhage
Authors: Rabie Fadil, Parshuram Aarotale, Shubha Majumder, Bijay Guargain
Abstract:
Hemorrhage is the loss of blood from the circulatory system and a leading cause of battlefield and postpartum-related deaths. Early detection of hemorrhage remains the most effective strategy to reduce the mortality rate caused by traumatic injuries. In this study, we investigated the physiological changes via non-invasive cardiac signals at rest and under different hemorrhage conditions simulated through graded lower-body negative pressure (LBNP). Simultaneous electrocardiogram (ECG), photoplethysmogram (PPG), blood pressure (BP), impedance cardiogram (ICG), and phonocardiogram (PCG) signals were acquired from 10 participants (age: 28 ± 6 years, weight: 73 ± 11 kg, height: 172 ± 8 cm). The LBNP protocol consisted of applying -20, -30, -40, -50, and -60 mmHg pressure to the lower half of the body. Beat-to-beat heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) were extracted from the ECG and blood pressure signals. Systolic amplitude (SA), systolic time (ST), diastolic time (DT), and left ventricular ejection time (LVET) were extracted from the PPG during each stage. Preliminary results showed that the application of -40 mmHg, i.e., the moderate stage of simulated hemorrhage, resulted in significant changes in HR (85 ± 4 bpm vs 68 ± 5 bpm, p < 0.01), ST (191 ± 10 ms vs 253 ± 31 ms, p < 0.05), LVET (350 ± 14 ms vs 479 ± 47 ms, p < 0.05), and DT (551 ± 22 ms vs 683 ± 59 ms, p < 0.05) compared to rest, while no change was observed in SA (p > 0.05) as a consequence of LBNP application. These findings demonstrate the potential of cardiac signals in detecting moderate hemorrhage. In future work, we will analyze all the LBNP stages and investigate the feasibility of other physiological signals to develop a predictive machine learning model for early detection of hemorrhage.
Keywords: blood pressure, hemorrhage, lower-body negative pressure, LBNP, machine learning
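As a sketch of the predictive model the authors plan, the example below trains a logistic regression on synthetic HR/ST/LVET/DT features scattered around the group means reported above. It is illustrative only, assuming scikit-learn and Gaussian feature scatter; it is not the study's actual model or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200  # synthetic beats per condition

def simulate(hr, st, lvet, dt, label):
    # Gaussian scatter around the group means reported in the abstract
    X = np.column_stack([
        rng.normal(hr, 5, n), rng.normal(st, 15, n),
        rng.normal(lvet, 20, n), rng.normal(dt, 30, n),
    ])
    return X, np.full(n, label)

X_rest, y_rest = simulate(68, 253, 479, 683, 0)   # rest
X_lbnp, y_lbnp = simulate(85, 191, 350, 551, 1)   # -40 mmHg LBNP

X = np.vstack([X_rest, X_lbnp])
y = np.concatenate([y_rest, y_lbnp])

# Binary classifier: rest vs moderate simulated hemorrhage
clf = LogisticRegression(max_iter=1000).fit(X, y)
acc = clf.score(X, y)
```

Because the reported group means differ by several standard deviations on each feature, even a linear model separates the two conditions almost perfectly on data like this; the real challenge the authors face is generalizing across LBNP stages and participants.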
Procedia PDF Downloads 167
16008 A Parallel Algorithm for Solving the PFSP on the Grid
Authors: Samia Kouki
Abstract:
Solving NP-hard combinatorial optimization problems by exact search methods, such as Branch-and-Bound, may degenerate into complete enumeration. For that reason, exact approaches are limited to small or moderate-size problem instances, due to the exponential increase in CPU time as problem size grows. One of the most promising ways to significantly reduce the computational burden of sequential versions of Branch-and-Bound is to design parallel versions of these algorithms which employ several processors. This paper describes a parallel Branch-and-Bound algorithm called GALB for solving the classical permutation flowshop scheduling problem (PFSP), as well as its implementation on a Grid computing infrastructure. The experimental study of our distributed parallel algorithm gives promising results and clearly shows the benefit of the parallel paradigm for solving large-scale instances in moderate CPU time.
Keywords: grid computing, permutation flow shop problem, branch and bound, load balancing
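GALB itself is parallel and grid-based and is not described in code in the abstract; the following is a minimal serial Branch-and-Bound sketch for the permutation flowshop problem, using a simple last-machine lower bound, to illustrate the search tree being parallelized. The bound and structure are generic textbook choices, not GALB's.

```python
from itertools import permutations

def makespan(seq, p):
    """Completion time of the last job on the last machine for a
    permutation `seq` over processing-time matrix p[job][machine]."""
    c = [0] * len(p[0])
    for j in seq:
        c[0] += p[j][0]
        for k in range(1, len(c)):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def branch_and_bound(p):
    """Exact PFSP solver: DFS over partial permutations, pruning with a
    bound = partial makespan + remaining work on the last machine."""
    n, m = len(p), len(p[0])
    best = [float("inf"), None]

    def recurse(partial, remaining, comp):
        if not remaining:
            if comp[-1] < best[0]:
                best[0], best[1] = comp[-1], partial[:]
            return
        for j in remaining:
            # Completion times on each machine after appending job j
            new = comp[:]
            new[0] += p[j][0]
            for k in range(1, m):
                new[k] = max(new[k], new[k - 1]) + p[j][k]
            # The last machine must still process every other remaining job
            lb = new[-1] + sum(p[i][-1] for i in remaining if i != j)
            if lb < best[0]:
                recurse(partial + [j], remaining - {j}, new)

    recurse([], frozenset(range(n)), [0] * m)
    return best[0], best[1]
```

A parallel version like the one the paper describes distributes subtrees (partial permutations) to grid nodes and shares the best-known makespan so all workers prune against it, which is where load balancing becomes critical.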
Procedia PDF Downloads 283
16007 Adsorption of Xylene Cyanol FF onto Activated Carbon from Brachystegia Eurycoma Seed Hulls: Determination of the Optimal Conditions by Statistical Design of Experiments
Authors: F. G Okibe, C. E Gimba, V. O Ajibola, I. G Ndukwe, E. D. Paul
Abstract:
A full factorial experimental design technique at two levels and four factors (2⁴) was used to optimize the adsorption (measured at 615 nm) of Xylene Cyanol FF from aqueous solutions onto activated carbon prepared from Brachystegia eurycoma seed hulls by a chemical carbonization method. The effects of pH (3 and 5), initial dye concentration (20 and 60 mg/l), adsorbent dosage (0.01 and 0.05 g), and contact time (30 and 60 min) on the removal efficiency of the adsorbent for the dye were investigated at 298 K. From the analysis of variance, response surface, and cube plot, adsorbent dosage was observed to be the most significant factor affecting the adsorption process. However, from the interaction between the variables studied, the optimum removal efficiency of 96.80% was achieved with an adsorbent dosage of 0.05 g, a contact time of 45 minutes, pH 3, and an initial dye concentration of 60 mg/l.
Keywords: factorial experimental design, adsorption, optimization, brachystegia eurycoma, xylene cyanol ff
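A minimal sketch of how a 2⁴ full factorial design is enumerated and its main effects computed. The response function and its coefficients below are synthetic stand-ins (with dosage made dominant, echoing the ANOVA finding), not the experimental data.

```python
from itertools import product

factors = ["pH", "conc", "dosage", "time"]  # coded levels -1 / +1

# All 16 runs of the 2^4 design
design = list(product([-1, 1], repeat=4))

# Synthetic removal efficiencies (%); coefficients are illustrative only
def response(run):
    ph, conc, dose, time = run
    return 80 + 1.5 * ph - 2.0 * conc + 8.0 * dose + 1.0 * time

y = [response(run) for run in design]

# Main effect of a factor: mean(y at +1 level) - mean(y at -1 level)
def main_effect(i):
    hi = [yi for run, yi in zip(design, y) if run[i] == 1]
    lo = [yi for run, yi in zip(design, y) if run[i] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {f: main_effect(i) for i, f in enumerate(factors)}
```

In a balanced two-level design each main effect is simply twice the coded regression coefficient, which is why the contrast of means above suffices and why the largest effect identifies the dominant factor.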
Procedia PDF Downloads 400
16006 Evaluation of Sequential Polymer Flooding in Multi-Layered Heterogeneous Reservoir
Authors: Panupong Lohrattanarungrot, Falan Srisuriyachai
Abstract:
Polymer flooding is a well-known technique used for controlling the mobility ratio in heterogeneous reservoirs, leading to improvement of sweep efficiency as well as the wellbore profile. However, the low injectivity of viscous polymer solutions attenuates the oil recovery rate and consequently adds extra operating cost. This study attempts to improve the injectivity of the polymer solution while maintaining the recovery factor, enhancing the effectiveness of the polymer flooding method. The study is performed using a reservoir simulation program to modify a conventional single polymer slug into sequential polymer flooding, with emphasis on increasing injectivity and reducing the amount of polymer. Operating conditions for the single polymer slug, including pre-injected water, polymer concentration, and polymer slug size, are first selected for a layered heterogeneous reservoir with a Lorenz coefficient (Lk) of 0.32. The selected single-slug polymer flooding scheme is then modified into sequential polymer flooding with decreasing polymer concentration in two different modes: constant polymer mass and reduced polymer mass. The effect of the residual resistance factor (RRF) is also evaluated. Simulation results show that the first polymer slug, which has the highest concentration, mainly acts as a buffer between the displacing phase and the reservoir oil. Moreover, part of the polymer from this slug is also sacrificed to adsorption. Reducing the polymer concentration in the following slugs prevents bypassing due to an unfavorable mobility ratio. At the same time, the following slugs, with lower viscosity, can be injected more easily through the formation, improving the injectivity of the whole process. Sequential polymer flooding with reduced polymer mass shows a clear benefit, reducing total production time and the amount of polymer consumed by up to 10% without any downside effect.
The only advantage of using constant polymer mass is a slight increment of the recovery factor (up to 1.4%), while total production time is almost the same. Increasing the residual resistance factor of the polymer solution benefits mobility control by reducing the effective permeability to water. Nevertheless, higher adsorption results in lower injectivity, extending total production time. Modifying a single polymer slug into a sequence of decreasing polymer concentrations yields major benefits in reducing production time as well as polymer mass. With a proper design of the polymer flooding scheme, the recovery factor can even be further increased. This study shows that sequential polymer flooding can certainly be applied to reservoirs with high heterogeneity, since real implementation requires nothing more complex than a proper design of polymer slug sizes and concentrations.
Keywords: polymer flooding, sequential, heterogeneous reservoir, residual resistance factor
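The quantity being controlled throughout is the mobility ratio M = (k_rw/μ_w)/(k_ro/μ_o), with M ≤ 1 conventionally considered favorable for sweep. The sketch below contrasts plain water with a viscosified polymer slug using illustrative endpoint relative permeabilities and viscosities, which are assumptions, not values from the paper.

```python
def mobility_ratio(krw, mu_w, kro, mu_o):
    """M = (k_rw / mu_w) / (k_ro / mu_o); M <= 1 is favorable."""
    return (krw / mu_w) / (kro / mu_o)

# Illustrative endpoint values (not from the paper)
krw, kro = 0.3, 0.8
mu_o = 5.0             # oil viscosity, cP
mu_water = 0.5         # plain water, cP
mu_polymer = 15.0      # viscosified polymer slug, cP

m_water = mobility_ratio(krw, mu_water, kro, mu_o)      # unfavorable, M > 1
m_polymer = mobility_ratio(krw, mu_polymer, kro, mu_o)  # favorable, M < 1
```

Raising the displacing-phase viscosity drops M well below 1, which suppresses fingering and bypassing; the trade-off the paper exploits is that the same viscosity hurts injectivity, motivating slugs of decreasing concentration.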
Procedia PDF Downloads 476
16005 Challenge Response-Based Authentication for a Mobile Voting System
Authors: Tohari Ahmad, Hudan Studiawan, Iwang Aryadinata, Royyana M. Ijtihadie, Waskitho Wibisono
Abstract:
Manual voting systems have been implemented worldwide. They have some weaknesses which may decrease the legitimacy of the voting result. Electronic voting systems were introduced to minimize these weaknesses and have been able to provide better results, in terms of the total time taken by the voting process and accuracy. Nevertheless, people may be reluctant to go to the polling location for reasons such as distance and time. To solve this problem, mobile voting is implemented by utilizing mobile devices. Many mobile voting architectures are available. Overall, the authenticity of the users is the common problem of all voting systems. There must be a mechanism which can verify the users' authenticity such that only verified users can vote, and each of them can vote only once. In this paper, a challenge response-based authentication is proposed, utilizing properties of the users, for example, something they have and something they know. In terms of speed, the proposed system provides good results, in addition to the other capabilities offered by the system.
Keywords: authentication, data protection, mobile voting, security
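A minimal sketch of one common way to realize challenge-response authentication: the server issues a fresh random nonce, the client proves knowledge of a shared secret by returning an HMAC over it, and the server verifies in constant time. The paper's actual scheme (which also incorporates "something they have") is not reproduced here; all names below are illustrative.

```python
import hmac
import hashlib
import secrets

def issue_challenge() -> bytes:
    """Server side: generate a fresh random nonce per login attempt,
    so a captured response cannot be replayed later."""
    return secrets.token_bytes(16)

def respond(shared_key: bytes, challenge: bytes) -> bytes:
    """Client side: prove knowledge of the key without transmitting it."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def verify(shared_key: bytes, challenge: bytes, response: bytes) -> bool:
    """Server side: constant-time comparison against the expected MAC."""
    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Because the secret never crosses the network and each nonce is single-use, an eavesdropper who records one exchange gains nothing usable for a later vote, which is the core property such a voting system needs.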
Procedia PDF Downloads 419
16004 Experimental Study on Two-Step Pyrolysis of Automotive Shredder Residue
Authors: Letizia Marchetti, Federica Annunzi, Federico Fiorini, Cristiano Nicolella
Abstract:
Automotive shredder residue (ASR) is a mixture of waste that makes up 20-25% of end-of-life vehicles. For many years, ASR was commonly disposed of in landfills or incinerated, causing serious environmental problems. Nowadays, thermochemical treatments are a promising alternative, although the heterogeneity of ASR still poses some challenges. One of the emerging thermochemical treatments for ASR is pyrolysis, which promotes the decomposition of long polymeric chains by providing heat in the absence of an oxidizing agent. In this way, pyrolysis converts ASR into solid, liquid, and gaseous fractions. This work aims to improve the performance of a two-step pyrolysis process. After characterization of the analysed ASR, the focus is on determining the effects of residence time on product yields and gas composition. A batch experimental setup that reproduces the entire process was used, consisting of three sections: the pyrolysis section (made of two reactors), the separation section, and the analysis section. Two different residence times were investigated to find suitable conditions for the first ASR sample. These first tests showed that the products obtained were more sensitive to the residence time in the second reactor. Indeed, slightly increasing the residence time in the second reactor raised the yields of gas and carbon residue and decreased the yield of the liquid fraction. Then, to test the versatility of the setup, the same conditions were applied to a different ASR sample coming from a different chemical plant. The comparison between the two ASR samples shows that similar product yields and compositions are obtained using the same setup.
Keywords: automotive shredder residue, experimental tests, heterogeneity, product yields, two-step pyrolysis
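Product yields in batch pyrolysis are typically reported as mass fractions of the feed, with the gas yield often obtained by difference since gases are hard to collect quantitatively. A minimal sketch of that mass balance, with illustrative numbers that are not the paper's data:

```python
def pyrolysis_yields(feed_g, char_g, liquid_g):
    """Mass-balance yields (wt% of feed) for a batch pyrolysis run;
    the gas fraction is taken by difference, as is common practice."""
    char = 100 * char_g / feed_g
    liquid = 100 * liquid_g / feed_g
    gas = 100 - char - liquid
    return {"char": char, "liquid": liquid, "gas": gas}

# Illustrative run: 50 g of ASR feed, weighed char and condensed liquid
yields = pyrolysis_yields(feed_g=50.0, char_g=21.0, liquid_g=17.5)
```

Tracking these three numbers across the two residence times is exactly how the trend reported above (more gas and char, less liquid at longer second-reactor residence time) would be quantified.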
Procedia PDF Downloads 127
16003 Creating Risk Maps on the Spatiotemporal Occurrence of Agricultural Insecticides in Sub-Saharan Africa
Authors: Chantal Hendriks, Harry Gibson, Anna Trett, Penny Hancock, Catherine Moyes
Abstract:
The use of modern inputs for crop protection, such as insecticides, is strongly underestimated in Sub-Saharan Africa. Several studies have measured toxic concentrations of insecticides in fruits, vegetables, and fish cultivated in Sub-Saharan Africa. The use of agricultural insecticides affects human and environmental health, but it also has the potential to influence insecticide resistance in malaria-transmitting mosquitoes. To analyse associations between the historic use of agricultural insecticides and the distribution of insecticide resistance through space and time, the use and environmental fate of agricultural insecticides need to be mapped over the same time period. However, data on the use and environmental fate of agricultural insecticides in Africa are limited, and therefore risk maps of the spatiotemporal occurrence of agricultural insecticides are created using environmental data. Environmental data on crop density and crop type were used to select the areas most likely to receive insecticides. These areas were verified by a literature review and expert knowledge. Pesticide fate models were compared to select the dominant processes involved in the environmental fate of insecticides that can be mapped at a continental scale. The selected processes include surface runoff, erosion, infiltration, volatilization, and the storing and filtering capacity of soils. These processes indicate the risk of insecticide accumulation in soil, water, sediment, and air. A compilation of all available data on traces of insecticides in the environment was used to validate the maps. The risk maps can result in space- and time-specific measures that reduce the risk of insecticide exposure to non-target organisms.
Keywords: crop protection, pesticide fate, tropics, insecticide resistance
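A hedged sketch of the weighted-overlay idea behind such risk maps: normalized process layers (runoff, erosion, infiltration, soil storage) are combined with weights and masked to cropped cells. The layer names, weights, and grid values below are illustrative assumptions, not the study's actual method or data.

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (4, 4)  # tiny stand-in for a continental raster grid

# Normalized environmental layers in [0, 1] (illustrative values);
# in practice these would come from gridded runoff, erosion,
# infiltration, and soil-property surfaces.
layers = {
    "runoff": rng.random(shape),
    "erosion": rng.random(shape),
    "infiltration": rng.random(shape),
    "soil_storage": rng.random(shape),
}
weights = {"runoff": 0.4, "erosion": 0.3,
           "infiltration": 0.2, "soil_storage": 0.1}  # assumed weights

# Crop mask: cells likely to receive insecticide applications,
# derived from crop density and crop type in the study
crop_mask = rng.random(shape) > 0.5

# Weighted overlay, restricted to cropped cells
risk = sum(weights[k] * layers[k] for k in layers)
risk = np.where(crop_mask, risk, 0.0)
```

Running the same overlay per year gives the spatiotemporal stack that can then be compared against the insecticide-trace compilation for validation.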
Procedia PDF Downloads 141