Search results for: Just in Time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18153

16293 An Optimization Model for Maximum Clique Problem Based on Semidefinite Programming

Authors: Derkaoui Orkia, Lehireche Ahmed

Abstract:

This article explores the potential of a powerful optimization technique, semidefinite programming (SDP), for solving NP-hard problems. SDP provides tight relaxations of combinatorial and quadratic problems. In this work, we solve the maximum clique problem using such a relaxation. The clique problem is the computational problem of finding cliques in a graph; it is widely acknowledged for its many applications to real-world problems. We implement a primal-dual interior-point algorithm to solve the semidefinite relaxation of the maximum clique problem, and this relaxation can be solved in polynomial time. The numerical results show that maximum cliques can be recovered efficiently in this way.
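The abstract does not include the SDP formulation itself. As a point of reference for what the relaxation replaces, a maximum clique can be found by exhaustive search, which takes exponential time and is practical only for small graphs. A minimal sketch in Python (illustrative only, not the paper's SDP method):

```python
from itertools import combinations

def max_clique(vertices, edges):
    """Exhaustive maximum-clique search: test vertex subsets from largest to smallest."""
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    for k in range(len(vertices), 0, -1):
        for subset in combinations(vertices, k):
            # a subset is a clique if every pair of its vertices is adjacent
            if all(b in adj[a] for a, b in combinations(subset, 2)):
                return set(subset)
    return set()
```

The SDP approach replaces this exponential enumeration with a polynomial-time relaxation whose solution is then rounded to a clique.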

Keywords: semidefinite programming, maximum clique problem, primal-dual interior point method, relaxation

Procedia PDF Downloads 222
16292 Family Caregiver Transitions and Health in Old Age: A Longitudinal Perspective

Authors: Cecilia Fagerstrom, Solve Elmstahl, Lena S. Wranker

Abstract:

Increased morbidity in an aging population makes family care at an advanced age more common. The role of family caregiver may last a long time but may also change over time, from caregiver to non-caregiver or vice versa. Although the demands associated with family caring change as individuals enter into, engage with, and exit from this role, the evidence regarding the impact of family caregiving transitions on the health of older carers is still limited. This study comprised individuals (n=2294, 60+ years) from the southern part of Sweden included in the Swedish National Study of Aging and Care. Caregiving transitions over a six-year period are discussed in three categories: entering, exiting, and continuing. Individuals who exited caregiving during this time were older than those who continued or entered the caregiving role. At the six-year follow-up, caregivers who were continuing or had exited caregiving were more often worried about their own health compared to baseline. Similar findings were not observed in those who entered caregiving. Family caregiving transitions of exiting, entering, or continuing had no effect on the individuals’ functional, physical, and mental health, except for participants who entered caregiving: for them, entering the role of family caregiver was associated with an improvement in physical health during the six-year follow-up period. Conclusion: Although the health impact of different caregiving transitions in late life does not differ, individual conditions and health at baseline are important parameters to take into consideration to improve the long-term health of family caregivers.

Keywords: family caregiving, health, old age, transition

Procedia PDF Downloads 219
16291 Evaluation of Esters Production by Oleic Acid Epoxidation Reaction

Authors: Flavio A. F. Da Ponte, Jackson Q. Malveira, Monica C. G. Albuquerque

Abstract:

In recent years, worldwide interest in renewable resources from biomass has spurred industry. In this work, the chemical structure of oleic acid chains was modified by homogeneous and heterogeneous catalysis in order to produce esters. The homogeneous epoxidation was carried out at an H2O2-to-oleic-acid-unsaturation molar ratio of 20:1, with a reaction temperature of 338 K, a reaction time of 16 h, and formic acid as catalyst. For heterogeneous catalysis, the reaction temperature was 343 K and the reaction time 24 h. The esters were produced by heterogeneous catalysis of the epoxidized oleic acid and butanol using Mg/SBA-15 as catalyst. The resulting products were confirmed by NMR (1H and 13C) and FTIR spectroscopy, and the products were characterized before and after each reaction. The catalysts were characterized by X-ray diffraction, X-ray fluorescence, thermogravimetric analysis (TGA), and BET surface areas. The results were satisfactory for the bioproducts formed.

Keywords: oleic acid, bioproduct, esters, epoxidation

Procedia PDF Downloads 356
16290 A Comprehensive Evaluation of Supervised Machine Learning for the Phase Identification Problem

Authors: Brandon Foggo, Nanpeng Yu

Abstract:

Power distribution circuits undergo frequent network topology changes that are often left undocumented. As a result, the documentation of a circuit’s connectivity becomes inaccurate with time. The lack of reliable circuit connectivity information is one of the biggest obstacles to modeling, monitoring, and controlling modern distribution systems. To enhance the reliability and efficiency of electric power distribution systems, the circuit’s connectivity information must be updated periodically. This paper focuses on one critical component of a distribution circuit’s topology: the secondary-transformer-to-phase association. This topology component describes the set of phase lines that feed power to a given secondary transformer (and therefore a given group of power consumers). Determining this association is called Phase Identification and is typically performed with physical measurements, which can take on the order of several months; with supervised learning, that time can be reduced significantly. This paper compares several such methods applied to Phase Identification for a large range of real distribution circuits, describes a method of training data selection, describes preprocessing steps unique to the Phase Identification problem, and ultimately describes a method which obtains high accuracy (> 96% in most cases, > 92% in the worst case) using only 5% of the measurements typically used for Phase Identification.
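The abstract does not name the specific classifiers compared. As an illustrative baseline only (an assumption, not necessarily one of the paper's methods), a common approach to phase identification is to assign each meter to the phase whose reference voltage time series correlates most strongly with the meter's own readings:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def identify_phase(meter_series, phase_refs):
    """Assign the meter to the phase with the most correlated voltage profile.

    phase_refs maps a phase label to its reference voltage series (hypothetical data layout).
    """
    return max(phase_refs, key=lambda p: pearson(meter_series, phase_refs[p]))
```

Supervised methods as described in the paper generalize this idea by learning the mapping from measurement features to phase labels.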

Keywords: distribution network, machine learning, network topology, phase identification, smart grid

Procedia PDF Downloads 299
16289 Phenols and Manganese Removal from Landfill Leachate and Municipal Waste Water Using the Constructed Wetland

Authors: Amin Mojiri, Lou Ziyang

Abstract:

Constructed wetland (CW) is a reasonable method to treat waste water. The current study was carried out to co-treat landfill leachate and domestic waste water using a CW system. Typha domingensis was transplanted to the CW, which encloses two substrate layers of the adsorbents ZELIAC and zeolite. Response surface methodology and central composite design were employed to evaluate the experimental data. Contact time (h) and leachate-to-wastewater mixing ratio (%; v/v) were selected as independent factors; phenols and manganese removal were selected as dependent responses. At the optimum contact time (48.7 h) and leachate-to-wastewater mixing ratio (20.0%), the removal efficiencies of phenols and manganese were 90.5% and 89.4%, respectively.

Keywords: constructed wetland, manganese, phenols, Typha domingensis

Procedia PDF Downloads 322
16288 Association of the Time in Targeted Blood Glucose Range of 3.9–10 Mmol/L with the Mortality of Critically Ill Patients with or without Diabetes

Authors: Guo Yu, Haoming Ma, Peiru Zhou

Abstract:

BACKGROUND: In addition to hyperglycemia, hypoglycemia, and glycemic variability, a decrease in the time in the targeted blood glucose range (TIR) may be associated with an increased risk of death for critically ill patients. However, the relationship between the TIR and mortality may be influenced by the presence of diabetes and by glycemic variability. METHODS: A total of 998 diabetic and non-diabetic patients with severe diseases in the ICU were selected for this retrospective analysis. The TIR is defined as the percentage of time spent in the target blood glucose range of 3.9–10.0 mmol/L within 24 hours. The relationship between the TIR and in-hospital death in diabetic and non-diabetic patients was analyzed, and the effect of glycemic variability was also assessed. RESULTS: The binary logistic regression model showed a significant association between the TIR as a continuous variable and the in-hospital death of severely ill non-diabetic patients (OR=0.991, P=0.015). As a classification variable, TIR≥70% was significantly associated with in-hospital death (OR=0.581, P=0.003); specifically, TIR≥70% was a protective factor against in-hospital death in severely ill non-diabetic patients. The TIR of severely ill diabetic patients was not significantly associated with in-hospital death; however, glycemic variability was significantly and independently associated with it (OR=1.042, P=0.027). Binary logistic regression analysis of comprehensive indices showed that for non-diabetic patients, the C3 index (low TIR & high CV) was a risk factor for increased mortality (OR=1.642, P<0.001). In addition, for diabetic patients, the C3 index was an independent risk factor for death (OR=1.994, P=0.008), and the C4 index (low TIR & low CV) was independently associated with increased survival.
CONCLUSIONS: The TIR of non-diabetic patients during ICU hospitalization was associated with in-hospital death even after adjusting for disease severity and glycemic variability. There was no significant association between the TIR and mortality of diabetic patients. However, for both diabetic and non-diabetic critically ill patients, the combined effect of high TIR and low CV was significantly associated with ICU mortality. Diabetic patients seem to have larger blood glucose fluctuations and can tolerate a wider TIR range. Both diabetic and non-diabetic critically ill patients should maintain blood glucose levels within the target range to reduce mortality.
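The two glycemic metrics used in the study follow directly from a sequence of blood glucose readings. A minimal sketch (the 24-hour windowing and the regression modelling are omitted):

```python
import statistics

def time_in_range(glucose_mmol, low=3.9, high=10.0):
    """Percentage of readings inside the target range (a proxy for TIR)."""
    hits = sum(1 for g in glucose_mmol if low <= g <= high)
    return 100.0 * hits / len(glucose_mmol)

def coefficient_of_variation(glucose_mmol):
    """Glycemic variability as CV (%) = standard deviation / mean * 100."""
    return 100.0 * statistics.stdev(glucose_mmol) / statistics.fmean(glucose_mmol)
```

The paper's composite indices (e.g., C3 = low TIR & high CV) are then simple threshold combinations of these two quantities.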

Keywords: severe disease, diabetes, blood glucose control, time in targeted blood glucose range, glycemic variability, mortality

Procedia PDF Downloads 222
16287 Advanced Numerical and Analytical Methods for Assessing Concrete Sewers and Their Remaining Service Life

Authors: Amir Alani, Mojtaba Mahmoodian, Anna Romanova, Asaad Faramarzi

Abstract:

Pipelines are extensively used engineering structures that convey fluid from one place to another. Most of the time, pipelines are placed underground, where they are loaded by soil weight and traffic loads. Corrosion of the pipe material is the most common form of pipeline deterioration and should be considered in both the strength and serviceability analysis of pipes. This research focuses on concrete pipes in sewage systems (concrete sewers). It first investigates how to incorporate the effect of corrosion, as a time-dependent deterioration process, into the structural and failure analysis of this type of pipe. Three probabilistic time-dependent reliability analysis methods, namely the first passage probability theory, the gamma distributed degradation model, and the Monte Carlo simulation technique, are then discussed and developed. Sensitivity analysis indexes which can be used to identify the most important parameters affecting pipe failure are also discussed. The reliability analysis methods developed in this paper serve as rational tools for decision makers with regard to the strengthening and rehabilitation of existing pipelines, and the results can be used to obtain a cost-effective strategy for the management of the sewer system.
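Of the three methods, Monte Carlo simulation is the most direct to sketch. Assuming a simple linear corrosion model (wall loss = rate × time, with a normally distributed corrosion rate; the parameter values below are illustrative, not the paper's), the time-dependent failure probability can be estimated as:

```python
import random

def failure_probability(t_years, critical_loss_mm=10.0,
                        rate_mean=0.5, rate_sd=0.1, n_sims=20000, seed=42):
    """Monte Carlo estimate of P(corrosion loss >= critical loss) at time t."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_sims):
        rate = rng.gauss(rate_mean, rate_sd)  # mm/year, sampled per trial
        if rate * t_years >= critical_loss_mm:
            failures += 1
    return failures / n_sims
```

The remaining service life is then the time at which this probability first exceeds an acceptable threshold.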

Keywords: reliability analysis, service life prediction, Monte Carlo simulation method, first passage probability theory, gamma distributed degradation model

Procedia PDF Downloads 457
16286 Approaches to Reduce the Complexity of Mathematical Models for the Operational Optimization of Large-Scale Virtual Power Plants in Public Energy Supply

Authors: Thomas Weber, Nina Strobel, Thomas Kohne, Eberhard Abele

Abstract:

In the context of the energy transition in Germany, the importance of so-called virtual power plants in the energy supply continues to increase. The progressive dismantling of large power plants and the ongoing construction of many new decentralized plants result in great potential for optimization through synergies between the individual plants. These potentials can be exploited by mathematical optimization algorithms, including linear programming and mixed-integer linear programming, to calculate the optimal operation schedule of decentralized power and heat generators and storage systems. In this paper, procedures for reducing the number of decision variables to be calculated are explained and validated. The first combines n similar installation types into one aggregated unit, described by the same constraints and objective function terms as a single plant. This reduces the number of decision variables per time step, and hence the complexity of the problem to be solved, by a factor of n. The exact operating mode of the individual plants can then be calculated in a second optimization such that the output of the individual plants matches the calculated output of the aggregated unit. Another way to reduce the number of decision variables in an optimization problem is to reduce the number of time steps to be calculated. This is useful if a high temporal resolution is not necessary for all time steps; for example, the volatility or the forecast quality of environmental parameters may justify a high or low temporal resolution of the optimization. Both approaches are examined for the resulting calculation time as well as for optimality. Several optimization models for virtual power plants (combined heat and power plants, heat storage, power storage, gas turbine) with different numbers of plants are used as a reference for the investigation of both procedures with regard to calculation duration and optimality.
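The first reduction (aggregating n identical units, then disaggregating) can be illustrated without a full MILP. A minimal sketch, assuming n identical units with output bounds [p_min, p_max] and an equal-split disaggregation (the paper's second optimization would refine this split):

```python
def dispatch_aggregated(demand, n_units, p_min, p_max):
    """Treat n identical plants as one unit with n-fold bounds, then split evenly."""
    agg_output = min(max(demand, n_units * p_min), n_units * p_max)
    per_unit = agg_output / n_units  # equal split is feasible since bounds scale by n
    return agg_output, [per_unit] * n_units
```

The point of the technique is that the first optimization sees one decision variable per time step instead of n, while the disaggregation step restores a per-plant schedule consistent with the aggregated result.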

Keywords: CHP, Energy 4.0, energy storage, MILP, optimization, virtual power plant

Procedia PDF Downloads 178
16285 Role of Activated Partial Thromboplastin Time (APTT) to Assess the Need of Platelet Transfusion in Dengue

Authors: Kalyan Koganti

Abstract:

Background: In India, platelet transfusions are given to a large number of patients suffering from dengue, due to the fear of bleeding, especially when platelet counts are low. Though many patients do not bleed when the platelet count falls below 20,000, certain patients bleed even when platelet counts are above 20,000 and without any prior comorbid condition (such as a gastrointestinal ulcer). This fear has led to huge numbers of unnecessary platelet transfusions, which cause a significant economic burden in low- and middle-income countries like India, and these transfusions sometimes end with transfusion-related adverse reactions. Objective: To identify the role of Activated Partial Thromboplastin Time (APTT), in comparison with thrombocytopenia, as an indicator of the real need for platelet transfusions. Method: A prospective study was conducted at a hospital in South India which included 176 admitted cases of dengue confirmed by immunochromatography. APTT was performed in all these patients along with platelet count. Cut-off values of > 60 seconds for APTT and < 20,000 for platelet count were used to assess bleeding manifestations. Results: Among the 176 patients, 56 had bleeding manifestations such as melena, hematuria, and bleeding gums. APTT > 60 seconds had a sensitivity and specificity of 93% and 90%, respectively, in identifying bleeding manifestations, whereas a platelet count of < 20,000 had a sensitivity and specificity of 64% and 73%, respectively. Conclusion: Elevated APTT levels can be considered an indicator of the need for platelet transfusion in dengue. As there is significant variation in platelet count among patients who bleed, APTT can be considered to avoid unnecessary transfusions.
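The reported sensitivities and specificities follow from simple confusion-matrix counts. A sketch of how such figures are derived from a threshold rule (the values below are illustrative, not the study's data):

```python
def sensitivity_specificity(test_positive, bled):
    """Sensitivity and specificity of a binary test against observed bleeding."""
    tp = sum(1 for t, b in zip(test_positive, bled) if t and b)
    fn = sum(1 for t, b in zip(test_positive, bled) if not t and b)
    tn = sum(1 for t, b in zip(test_positive, bled) if not t and not b)
    fp = sum(1 for t, b in zip(test_positive, bled) if t and not b)
    return tp / (tp + fn), tn / (tn + fp)

def aptt_flag(aptt_seconds, cutoff=60.0):
    """Flag patients whose APTT exceeds the study's cut-off."""
    return [a > cutoff for a in aptt_seconds]
```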

Keywords: activated partial thromboplastin time, dengue, platelet transfusion, thrombocytopenia

Procedia PDF Downloads 216
16284 Incorporating Moving Authority Limits Into Driving Advice

Authors: Peng Zhou, Peter Pudney

Abstract:

Driver advice systems are used by many rail operators to help train drivers improve timekeeping while minimising energy use. These systems typically operate independently of the safeworking system, because information on how far the train is allowed to travel, the "limit of authority", is usually not available as real-time data that can be used when generating driving advice. This is not an issue when there is sufficient separation between trains, but on systems with low headways, driving advice could conflict with safeworking requirements. We describe a method for generating driving advice that takes into account a moving limit of authority communicated to the train in real time. We illustrate the method with four simulated examples using data from the Zhengzhou Metro. The method will allow driver advice systems to be used more effectively on railways with low headways.
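A core ingredient of such advice is capping the advised speed by a braking curve to the moving limit of authority. A minimal sketch, assuming a constant service braking rate and a fixed safety margin (both values illustrative, not from the paper):

```python
import math

def max_advised_speed(dist_to_loa_m, brake_rate=0.8, margin_m=50.0):
    """Highest speed (m/s) from which the train can stop before the limit
    of authority, leaving a fixed safety margin (v^2 = 2 a d)."""
    usable = max(0.0, dist_to_loa_m - margin_m)
    return math.sqrt(2.0 * brake_rate * usable)

def advised_speed(profile_speed, dist_to_loa_m):
    """Advice is the energy-optimal profile speed, capped by the authority."""
    return min(profile_speed, max_advised_speed(dist_to_loa_m))
```

As the limit of authority moves forward with the preceding train, the cap relaxes and the advice reverts to the energy-optimal profile.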

Keywords: railway transportation, energy efficient train operation, optimal train control, safe separation

Procedia PDF Downloads 9
16283 Theoretical Approach to Kinetics of Transient Plasticity of Metals under Irradiation

Authors: Pavlo Selyshchev, Tetiana Didenko

Abstract:

Within the framework of obstacle radiation hardening and the dislocation climb-glide model, a theoretical approach is developed to describe peculiarities of the transient plasticity of metals under irradiation. We consider the nonlinear dynamics of the accumulation of point defects (vacancies and interstitial atoms) in a metal under stress and irradiation conditions for which creep is determined by dislocation motion: dislocations climb over obstacles and glide between them. It is shown that the rivalry between the vacancy and interstitial fluxes to a dislocation leads to fractures (slope changes) in the time dependence of plasticity. Simulation and analysis of this phenomenon are performed, and qualitatively different regimes of transient plasticity under irradiation are found. The fracture time is obtained, and the theoretical results are compared with experimental ones.
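The rate-theory backbone of such models is a pair of coupled balance equations for the vacancy and interstitial concentrations: production by irradiation, mutual recombination, and loss to sinks such as dislocations. A minimal forward-Euler sketch with illustrative, dimensionless parameters (not the paper's values):

```python
def evolve_defects(K=1e-4, recomb=10.0, sink_v=0.01, sink_i=1.0,
                   dt=0.1, steps=20000):
    """Integrate dC/dt = K - recomb*Cv*Ci - sink*C for both defect types."""
    cv = ci = 0.0
    for _ in range(steps):
        r = recomb * cv * ci          # mutual recombination rate
        cv += dt * (K - r - sink_v * cv)
        ci += dt * (K - r - sink_i * ci)
    return cv, ci
```

Interstitials, being far more mobile (the larger sink loss rate here), settle at a much lower concentration than vacancies; the competition between these two fluxes to dislocations is what produces the slope changes in the plasticity curve.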

Keywords: climb and glide of dislocations, fractures of transient plasticity, irradiation, non-linear feed-back, point defects

Procedia PDF Downloads 202
16282 Comparison of the Glidescope Visualization and Neck Flexion with Lateral Neck Pressure Nasogastric Tube Insertion Techniques in Anaesthetized Patients: A Prospective Randomized Clinical Study

Authors: Pitchaporn Purngpiputtrakul, Suttasinee Petsakul, Sunisa Chatmongkolchart

Abstract:

Nasogastric tube (NGT) insertion in anaesthetized and intubated patients can be challenging even for experienced anesthesiologists, and various techniques have been proposed to facilitate it. This study aimed to compare the success rate and time required for NGT insertion between the GlideScope visualization technique and the neck flexion with lateral neck pressure technique. This randomized clinical trial was performed at a teaching hospital on 86 adult patients undergoing abdominal surgery under relaxant general anaesthesia who required intraoperative NGT insertion. The patients were randomized into two groups, the GlideScope group (group G) and the neck flexion with lateral neck pressure group (group F). The success rates of the first and second attempts, the duration of insertion, and complications were recorded. The total success rate was 79.1% in group G compared with 76.7% in group F (P=1). The median time required for NGT insertion was significantly longer in group G for both the first and second attempts (97 vs 42 seconds, P<0.001; 70 vs 48.5 seconds, P=0.015, respectively). Complications were reported in 23 patients (53.5%) in group G and 13 patients (30.2%) in group F; bleeding and kinking were the most common complications with both techniques. GlideScope visualization to facilitate NGT insertion was comparable to the neck flexion with lateral neck pressure technique in success rate, while the neck flexion with lateral neck pressure technique had fewer complications and was less time-consuming.

Keywords: anaesthesia, nasogastric tube, GlideScope, intubation

Procedia PDF Downloads 165
16281 Development the Sensor Lock Knee Joint and Evaluation of Its Effect on Walking and Energy Consumption in Subjects With Quadriceps Weakness

Authors: Mokhtar Arazpour

Abstract:

Objectives: Recently a new kind of stance-control knee joint, called the 'sensor lock', has been developed. This study aimed to develop and evaluate the 'sensor lock', which could potentially improve walking parameters and gait symmetry in subjects with quadriceps weakness. Methods: Nine subjects with quadriceps weakness were enrolled in this study. A custom-made knee ankle foot orthosis (KAFO) with the same set of components was constructed for each participant. Testing began after orthotic gait training was completed with each of the KAFOs and subjects had demonstrated that they could safely walk with crutches; subjects rested 30 minutes between trials. The 10-meter walk test was used to assess walking speed in meters/second (m/s): the total time taken to ambulate 6 meters (m) is recorded to the nearest hundredth of a second, and 6 m is then divided by that total time (in seconds) to give the speed in m/s. The 6-minute walk test was used to assess walking endurance: participants walked around the perimeter of a set circuit for a total of six minutes. To evaluate the physiological cost index (PCI), the subjects were asked to walk with each type of KAFO along a pre-determined 40 m rectangular walkway at their comfortable self-selected speed; a stopwatch was used to calculate walking speed from the distance walked and the time between start and stop. Results: The use of a KAFO fitted with the 'sensor lock' knee joint improved walking speed, distance walked, and physiological cost index compared with the knee joint in locked mode. Conclusions: This study demonstrated that a KAFO with the 'sensor lock' knee joint can provide significant benefits for subjects with quadriceps weakness compared to a KAFO with the knee joint in locked mode.
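The physiological cost index used here is a simple ratio of heart-rate elevation to walking speed. A sketch of the standard formula (the heart rates and times below are illustrative):

```python
def physiological_cost_index(hr_walking_bpm, hr_resting_bpm, speed_m_per_min):
    """PCI (beats/metre) = (walking HR - resting HR) / walking speed."""
    return (hr_walking_bpm - hr_resting_bpm) / speed_m_per_min

def walking_speed_m_per_min(distance_m, time_s):
    """Walking speed from a timed walk, e.g. the 40 m walkway in this study."""
    return distance_m / (time_s / 60.0)
```

A lower PCI indicates less energy expended per metre walked, which is the outcome the 'sensor lock' joint improved here.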

Keywords: stance control knee joint, knee ankle foot orthosis, quadriceps weakness, walking, energy consumption

Procedia PDF Downloads 125
16280 Analysis of the Result for the Accelerated Life Cycle Test of the Motor for Washing Machine by Using Acceleration Factor

Authors: Youn-Sung Kim, Jin-Ho Jo, Mi-Sung Kim, Jae-Kun Lee

Abstract:

Accelerated life cycle testing is applied to various products and components in industry in order to reduce life-cycle test time. Many test conditions must be considered according to the characteristics of the product under test, and the selection of the acceleration parameter is especially important. We carried out both a general life cycle test and an accelerated life cycle test by applying an acceleration factor (AF) chosen for the characteristics of a brushless DC (BLDC) motor for a washing machine. The final purpose of this study is to verify the validity of the approach by comparing the results of the general and accelerated life cycle tests, making it possible to reduce life-test time through a reasonable accelerated life cycle test.
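For temperature-driven stresses, a common choice of acceleration factor is the Arrhenius model. The abstract does not state which model was used for the BLDC motor, so this is a generic sketch; the activation energy below is illustrative:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(activation_energy_ev, use_temp_c, accel_temp_c):
    """AF = exp(Ea/k * (1/T_use - 1/T_accel)), temperatures converted to kelvin."""
    t_use = use_temp_c + 273.15
    t_acc = accel_temp_c + 273.15
    return math.exp(activation_energy_ev / BOLTZMANN_EV * (1.0 / t_use - 1.0 / t_acc))
```

A test of duration D at the accelerated temperature then stands in for AF × D at use conditions, which is how the accelerated test shortens the life-test time.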

Keywords: accelerated life cycle test, reliability test, motor for washing machine, brushless dc motor test

Procedia PDF Downloads 611
16279 Vibration Imaging Method for Vibrating Objects with Translation

Authors: Kohei Shimasaki, Tomoaki Okamura, Idaku Ishii

Abstract:

We propose a vibration imaging method for high frame rate (HFR)-video-based localization of vibrating objects with large translations. When the ratio of the translation speed of a target to its vibration frequency is large, obtaining its frequency response in image intensities becomes difficult because one or no waves are observable at the same pixel. Our method can precisely localize moving objects with vibration by virtually translating multiple image sequences for pixel-level short-time Fourier transform to observe multiple waves at the same pixel. The effectiveness of the proposed method is demonstrated by analyzing several HFR videos of flying insects in real scenarios.
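At the heart of the method is a per-pixel frequency analysis of the intensity signal. A minimal sketch of recovering a vibration frequency from one pixel's intensity sequence with a plain DFT (a full pixel-level STFT adds windowing over time, and the paper's virtual-translation step is omitted):

```python
import cmath
import math

def dominant_frequency(intensity, fps):
    """Return the frequency (Hz) of the strongest non-DC bin of the signal."""
    n = len(intensity)
    mean = sum(intensity) / n
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        coeff = sum((intensity[t] - mean) * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * fps / n
```

The virtual translation in the paper exists precisely so that a moving vibration source keeps contributing whole waves to the same (virtually shifted) pixel, making this per-pixel analysis valid.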

Keywords: HFR video analysis, pixel-level vibration source localization, short-time Fourier transform, virtual translation

Procedia PDF Downloads 108
16278 An Assessment of Factors Affecting the Cost and Time Performance of Subcontractors

Authors: Adedayo Jeremiah Adeyekun, Samuel Oluwagbemiga Ishola

Abstract:

This paper assesses the factors influencing the cost and time performance of subcontractors and the need for effective subcontractor performance at project sites. Some of these factors are associated with the project or the organization, while others bear directly on the subcontractors themselves: management-level leadership, the time required to complete the project, profit, staff capability/expertise, reputation, payment method, organization history, project procurement strategy, security, bidding technique, insurance, bonds, and the relationship with the main contractor. The factors influencing the management of subcontractors in building development projects include performance on significant past projects, standard of workmanship, compliance with guidelines, regular payment of labourers, adherence to programme, the regularity and effectiveness of communication with the main contractor, and adherence to subcontract requirements. Other factors comprise adherence to statutory environmental regulations, the number of experienced site administrative staff, inspection and maintenance of a good workplace, the number of artisans and workers, the quality of as-built and shop drawings, and the ability to carry out the required quantity of work. This study also aimed to suggest a way forward to improve the performance of subcontractors, whose low performance is a reason for exceeding budgets at project sites. To carry out this study, a questionnaire was drafted to derive information on the causes of the low performance of subcontractors and the implications for cost.

Keywords: performance, contractor, subcontractors, construction

Procedia PDF Downloads 76
16277 Sync Consensus Algorithm: Trying to Reach an Agreement at Full Speed

Authors: Yuri Zinchenko

Abstract:

Recently, distributed storage systems have been used more and more in various aspects of everyday life. They provide such necessary properties as Scalability, Fault Tolerance, Durability, and others. At the same time, not only reliable but also fast data storage remains one of the most pressing issues in this area. That brings us to the consensus algorithm as one of the most important components that has a great impact on the functionality of a distributed system. This paper is the result of an analysis of several well-known consensus algorithms, such as Paxos and Raft. The algorithm it offers, called Sync, promotes, but does not insist on simultaneous writing to the nodes (which positively affects the overall writing speed) and tries to minimize the system's inactive time. This allows nodes to reach agreement on the system state in a shorter period, which is a critical factor for distributed systems. Also when developing Sync, a lot of attention was paid to such criteria as simplicity and intuitiveness, the importance of which is difficult to overestimate.

Keywords: sync, consensus algorithm, distributed system, leader-based, synchronization

Procedia PDF Downloads 62
16276 Ultra-Wideband Antennas for Ultra-Wideband Communication and Sensing Systems

Authors: Meng Miao, Jeongwoo Han, Cam Nguyen

Abstract:

Ultra-wideband (UWB) time-domain impulse communication and radar systems use ultra-short pulses in the sub-nanosecond regime, instead of continuous sinusoidal waves, to transmit information. The pulse directly generates a very wide-band instantaneous signal with various duty cycles depending on the specific usage. In UWB systems, the total transmitted power is spread over an extremely wide range of frequencies, so the power spectral density is extremely low. This results in extremely small interference to other radio signals while maintaining excellent immunity to interference from those signals. UWB devices can therefore work within frequencies already allocated for other radio services, helping to maximize this dwindling resource. The impulse UWB technique is thus attractive for realizing high-data-rate short-range communications, ground penetrating radar (GPR), and military radar with relatively low emission power levels. UWB antennas are the key element dictating the transmitted and received pulse shape and amplitude in both the time and frequency domains; they should have a good impulse response with minimal distortion. To facilitate integration with transmitters and receivers employing microwave integrated circuits, UWB antennas enabling direct integration are preferred. We present the development of two UWB antennas, operating from 3.1 to 10.6 GHz and from 0.3 to 6 GHz, for UWB systems that provide direct integration with microwave integrated circuits. The operation of these antennas is based on the principle of wave propagation on a non-uniform transmission line. Time-domain EM simulation is conducted to optimize the antenna structures to minimize the reflections occurring at the open-end transition. Calculated and measured results of these UWB antennas are presented in both the frequency and time domains, and the antennas show good time-domain responses.
They can transmit and receive pulses effectively with minimum distortion, little ringing, and small reflection, clearly demonstrating the signal fidelity of the antennas in reproducing the waveform of UWB signals which is critical for UWB sensors and communication systems. Good performance together with seamless microwave integrated-circuit integration makes these antennas good candidates not only for UWB applications but also for integration with printed-circuit UWB transmitters and receivers.

Keywords: antennas, ultra-wideband, UWB, UWB communication systems, UWB radar systems

Procedia PDF Downloads 238
16275 Automatic Seizure Detection Using Weighted Permutation Entropy and Support Vector Machine

Authors: Noha Seddik, Sherine Youssef, Mohamed Kholeif

Abstract:

The automated epileptic seizure detection research field has emerged in recent years; it involves analyzing the electroencephalogram (EEG) signals instead of the traditional visual inspection performed by expert neurologists. In this study, a Support Vector Machine (SVM) that uses Weighted Permutation Entropy (WPE) as the input feature is proposed for classifying normal and seizure EEG records. WPE is a modified form of permutation entropy (PE) that measures the complexity and irregularity of a time series; it incorporates both the mapped ordinal pattern of the time series and the information contained in the amplitude of its sample points. The proposed system exploits the fact that entropy-based measures of EEG segments during an epileptic seizure are lower than those of normal EEG.
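Weighted permutation entropy is compact enough to sketch directly: each length-m window contributes its ordinal pattern, weighted by the window's variance so that high-amplitude structure counts more (the embedding dimension and delay below are the usual defaults, not necessarily the study's settings):

```python
import math

def weighted_permutation_entropy(x, m=3, delay=1):
    """WPE of a time series: Shannon entropy of variance-weighted ordinal patterns."""
    n = len(x) - (m - 1) * delay
    pattern_weight = {}
    total = 0.0
    for i in range(n):
        window = [x[i + j * delay] for j in range(m)]
        pattern = tuple(sorted(range(m), key=window.__getitem__))  # ordinal pattern
        mu = sum(window) / m
        w = sum((v - mu) ** 2 for v in window) / m  # window variance as weight
        pattern_weight[pattern] = pattern_weight.get(pattern, 0.0) + w
        total += w
    if total == 0.0:  # constant signal: no variability, zero entropy
        return 0.0
    return -sum((w / total) * math.log(w / total)
                for w in pattern_weight.values() if w > 0.0)
```

A regular (e.g., seizure-like) segment concentrates its weight in few patterns and yields low WPE; an irregular normal segment spreads weight across patterns and yields high WPE, which is the contrast the SVM input exploits.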

Keywords: electroencephalogram (EEG), epileptic seizure detection, weighted permutation entropy (WPE), support vector machine (SVM)

Procedia PDF Downloads 372
16274 Time Efficient Color Coding for Structured-Light 3D Scanner

Authors: Po-Hao Huang, Pei-Ju Chiang

Abstract:

The structured-light 3D scanner is commonly used for measuring the 3D shape of an object: designed light patterns are projected onto the object, and the deformed patterns obtained are used for geometric shape reconstruction. At present, Gray code is the most reliable and commonly used light pattern in structured-light 3D scanners. However, the trade-off between scanning efficiency and accuracy is a long-standing and challenging problem, and the design of the light patterns plays a significant role in both. We therefore propose a novel encoding method that integrates color information with Gray code to improve scanning efficiency. We demonstrate that with the proposed method, the scanning time can be reduced to approximately half that needed by Gray code alone, without loss of precision.
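The efficiency gain can be seen from a pattern count. Binary Gray code needs one projected pattern per bit of the stripe index; if each stripe can additionally take one of c colors, each pattern carries log2(c) bits, so fewer patterns are needed. A sketch of the counting argument (the specific color alphabet is an assumption for illustration, not the paper's encoding):

```python
def gray_code(i):
    """Standard reflected binary Gray code of integer i."""
    return i ^ (i >> 1)

def patterns_needed(num_stripes, colors_per_stripe=2):
    """Projected patterns needed to give each stripe a unique code word,
    when each stripe shows one of `colors_per_stripe` symbols per pattern."""
    bits = 0
    while colors_per_stripe ** bits < num_stripes:
        bits += 1
    return bits
```

With four distinguishable colors, the pattern count is half that of binary Gray code, consistent with the roughly halved scanning time reported.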

Keywords: gray-code, structured light scanner, 3D shape acquisition, 3D reconstruction

Procedia PDF Downloads 457
16273 Addiction Counseling Resources: A Qualitative Study

Authors: Cailyn Green

Abstract:

Substance use counselors have a variety of fast-paced tasks and responsibilities, and professional resources are designed to make their job duties easier and less stressful. The purpose of this research was to identify what types of resources would support addiction counselors in performing their job duties. Counselors often must step in and facilitate a group counseling session with little to no preparation time, which causes stress and creates pressure to come up with a clinical group activity on short notice. The researcher conducted qualitative interviews focused on identifying what types of resources would help addiction counselors do their jobs more easily and effectively. The researcher visited 23 different addiction counseling facilities seeking participants; altogether, 15 interviews were collected across six different substance-use counseling facilities. The interviews guided the researcher toward creating an open educational resource (OER) of group activities for addiction counselors to utilize.

Keywords: addiction, counseling, resources, OER, treatment

Procedia PDF Downloads 76
16272 Managing High-Performance Virtual Teams

Authors: Mehdi Rezai, Asghar Zamani

Abstract:

Virtual teams are a reality in today’s fast-paced world. With shared resources, an increase in inter-organizational projects, cooperation and outsourcing, and a growing number of people who work remotely or on flexitime, an extensive and active presence of high-performance teams is a must. Virtual teams are a challenge in themselves: their members work across the boundaries of cultures, time zones and organizations, and they often communicate through electronic devices over considerable distances. We first examine the management of virtual teams by considering issues such as cultural and personal diversity, communication, and coordination. We then examine the individuals, processes and tools that exist within a team. The main challenge is managing high-performance virtual teams, so we must first examine the concept of performance, and then focus on teams and the best methods of managing them. Constant improvement of performance, together with precisely regulating each individual’s way of working, raises performance levels over time. High-performance teams exploit every issue as an opportunity for achieving high performance, and carrying out projects with high performance is among the objectives of every organization and team. Performance can be measured against many criteria, including completing projects on time, satisfying stakeholders, and staying within budget. Elements such as clear objectives, clearly defined roles and responsibilities, effective communication, and commitment to collaboration are essential to a team’s effectiveness. Finally, we examine roles, systems and processes, and carry out a cause-and-effect analysis of the criteria that improve a team’s performance.

Keywords: virtual teams, performance, management, process, improvement, effectiveness

Procedia PDF Downloads 148
16271 Pentax Airway Scope Video Laryngoscope for Orotracheal Intubation in Children: A Randomized Controlled Trial

Authors: In Kyong Yi, Yun Jeong Chae, Jihoon Hwang, Sook-Young Lee, Jong-Yeop Kim

Abstract:

Background: The Pentax airway scope (AWS) is a recently developed video laryngoscope for use in both normal and difficult airways, providing a good laryngeal view. The purpose of this randomized noninferiority study was to evaluate the efficacy of the Pentax-AWS regarding intubation time, laryngeal view and ease of intubation in pediatric patients with normal airways, compared to the Macintosh laryngoscope. Method: A total of 136 pediatric patients aged 1 to 10 years, with American Society of Anesthesiologists physical status I or II, undergoing general anesthesia requiring orotracheal intubation, were randomly allocated into two groups: Macintosh laryngoscope (n=68) and Pentax AWS (n=68). Anesthesia was induced with propofol, rocuronium, and sevoflurane. The primary outcome was intubation time. Cormack-Lehane laryngeal view grade, application of optimal external laryngeal manipulation (OELM), intubation difficulty scale (IDS), intubation failure rate and adverse events were also measured. Result: No significant difference was observed between the two groups regarding intubation time (Macintosh 23 [22-26] s vs. Pentax 23.5 [22-27.75] s, p=0.713). As for the laryngeal view, the Pentax group showed fewer cases of grade 2a or worse compared to the Macintosh group (grade 1/2a/2b/3: 52.9%/41.2%/4.4%/1.5% vs. 98.5%/1.5%/0%/0%, p<0.001). No optimal external laryngeal manipulation was required in the Pentax group (38.2% vs. 0%, p<0.001). Intubation difficulty scale scores were lower in the Pentax group (0 [0-2] vs. 0 [0-0.55], p=0.001). The failure rate did not differ between the two groups (1.5% vs. 4.4%, p=0.619). Regarding adverse events, a slightly higher incidence of bleeding (1.5% vs. 5.9%, p=0.172) and of dental injury (0% vs. 5.9%, p=0.042) occurred in the Pentax group. Conclusion: The Pentax-AWS provided a better laryngeal view with similar intubation time and success rate compared with the Macintosh laryngoscope in children with normal airways. However, the possibly increased risk of dental injury warrants special attention.

Keywords: Pentax-AWS, pediatric, video laryngoscope, intubation

Procedia PDF Downloads 202
16270 A Laser Instrument Rapid-E+ for Real-Time Measurements of Airborne Bioaerosols Such as Bacteria, Fungi, and Pollen

Authors: Minghui Zhang, Sirine Fkaier, Sabri Fernana, Svetlana Kiseleva, Denis Kiselev

Abstract:

The real-time identification of bacteria and fungi is difficult because they emit much weaker signals than pollen. In 2020, Plair developed Rapid-E+, which extends the abilities of Rapid-E to detect smaller bioaerosols such as bacteria and fungal spores, with diameters down to 0.3 µm, while keeping similar or even better capability for measurements of large bioaerosols like pollen. Rapid-E+ enables simultaneous measurements of (1) time-resolved, polarization- and angle-dependent Mie scattering patterns, (2) fluorescence spectra resolved in 16 channels, and (3) the fluorescence lifetime of individual particles. Moreover, (4) it provides 2D Mie scattering images which give full information on particle morphology. The parameters of every single bioaerosol aspirated into the instrument are subsequently analysed by machine learning. First, pure species of microbes, e.g., Bacillus subtilis (a bacterium) and Penicillium chrysogenum (a fungal spore species), were aerosolized in a bioaerosol chamber for Rapid-E+ training. Afterwards, we tested the microbes at different concentrations, using several steps of data analysis to classify and identify them. All single particles were analysed using the parameters of light scattering and fluorescence in the following steps: (1) a smart filter block removes non-microbe particles; (2) a classification algorithm verifies that the filtered particles are microbes, based on the calibration data; and (3) a user-defined probability threshold step assigns the probability of being a microbe, ranging from 0 to 100%. We demonstrate how Rapid-E+ simultaneously identified microbes using the results for Bacillus subtilis (bacteria) and Penicillium chrysogenum (fungal spores). Using machine learning, Rapid-E+ achieved an identification precision of 99% against the background; further classification gave precisions of 87% and 89% for Bacillus subtilis and Penicillium chrysogenum, respectively.
The developed algorithm was subsequently used to evaluate the performance of microbe classification and quantification in real time. The bacteria and fungi were aerosolized again in the chamber at different concentrations, and Rapid-E+ classified the different types of microbes and quantified them in real time. Rapid-E+ also identifies pollen down to the species level, with similar or better performance than the previous version (Rapid-E). It is therefore an all-in-one instrument that classifies and quantifies not only pollen, but also bacteria and fungi. Based on the machine learning platform, users can further develop proprietary algorithms for specific microbes (e.g., virus aerosols) and other aerosols (e.g., combustion-related particles that contain polycyclic aromatic hydrocarbons).
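The three per-particle analysis steps described above can be sketched as a small pipeline. Everything here is a hypothetical stand-in, not Plair's proprietary algorithm: the 16-channel "spectra", the class means, the nearest-mean classifier with a softmax pseudo-probability, and all thresholds are illustrative assumptions.

```python
import numpy as np

# Hypothetical calibration data: one mean 16-channel spectrum per species.
CLASS_MEANS = {
    "Bacillus subtilis": np.linspace(0.2, 0.8, 16),
    "Penicillium chrysogenum": np.linspace(0.8, 0.2, 16),
}

def smart_filter(particles, min_signal=0.5):
    """Step 1: drop particles whose summed fluorescence is background-level."""
    return [p for p in particles if p.sum() >= min_signal]

def classify(particle):
    """Step 2: nearest calibration mean, with a softmax pseudo-probability."""
    weights = {name: np.exp(-np.linalg.norm(particle - mu))
               for name, mu in CLASS_MEANS.items()}
    label = max(weights, key=weights.get)
    return label, weights[label] / sum(weights.values())

def identify(particles, prob_threshold=0.6):
    """Step 3: keep only labels whose probability clears the user threshold."""
    results = []
    for p in smart_filter(particles):
        label, prob = classify(p)
        if prob >= prob_threshold:
            results.append(label)
    return results

rng = np.random.default_rng(1)
sample = [CLASS_MEANS["Bacillus subtilis"] + rng.normal(0, 0.02, 16)
          for _ in range(5)]
print(identify(sample))
```

The real instrument additionally fuses scattering patterns, images and fluorescence lifetimes into the feature vector; the pipeline shape (filter, classify, threshold) is what carries over from the description above.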

Keywords: bioaerosols, laser-induced fluorescence, Mie-scattering, microorganisms

Procedia PDF Downloads 90
16269 Banking Crisis and Economic Effects of the Banking Crisis in Turkey

Authors: Sevilay Konya, Sadife Güngör, Zeynep Karaçor

Abstract:

The Turkish economy has experienced banking crises of different magnitudes from time to time, driven by different factors. Crises are foremost among the factors that hinder the development of countries and societies; they affect economic growth, inflation, unemployment and foreign trade. In this study, the effects of the November 2000, February 2001 and 2008 banking crises on the Turkish economy are examined conceptually. In this context, Turkey's GDP, inflation, unemployment and foreign trade figures are investigated, and the ways in which the economy was affected by the November 2000, February 2001 and 2008 banking crises are identified.

Keywords: banking crises, Turkey’s economy, economic effects, Turkey

Procedia PDF Downloads 295
16268 Cooperative Coevolution for Neuro-Evolution of Feed Forward Networks for Time Series Prediction Using Hidden Neuron Connections

Authors: Ravneil Nand

Abstract:

Cooperative coevolution uses problem decomposition to solve a larger problem by breaking it down into a number of smaller sub-problems. Different problem decomposition methods have their own strengths and limitations, depending on the neural network used and the application problem. This paper introduces a new problem decomposition method known as Hidden-Neuron Level Decomposition (HNL) and compares it against established problem decomposition methods on time series prediction. The results show that the proposed approach improves the results on some benchmark data sets when compared to the standalone method, and is competitive with methods from the literature.
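A toy sketch of the idea: decompose a tiny 1-3-1 feed forward network at the level of its hidden neurons, and evolve the subcomponents cooperatively on a synthetic regression task. The round-robin schedule, greedy mutation and fitness below are illustrative simplifications, not the paper's HNL algorithm:

```python
import numpy as np

rng = np.random.default_rng(42)
X = np.linspace(-1, 1, 32).reshape(-1, 1)
Y = np.sin(np.pi * X)                      # synthetic target mapping
N_HIDDEN = 3

def forward(subcomponents, x):
    """Each subcomponent holds one hidden neuron's weights: [w_in, b, w_out]."""
    out = np.zeros_like(x)
    for w_in, b, w_out in subcomponents:
        out = out + w_out * np.tanh(w_in * x + b)
    return out

def fitness(subcomponents):
    """Subcomponents are always evaluated together in the full network."""
    return -float(np.mean((forward(subcomponents, X) - Y) ** 2))

# One subcomponent per hidden neuron: the hidden-neuron level decomposition.
subs = [rng.normal(0.0, 0.5, 3) for _ in range(N_HIDDEN)]
best = fitness(subs)
for generation in range(2000):
    i = generation % N_HIDDEN              # round-robin over subcomponents
    trial = [s.copy() for s in subs]
    trial[i] += rng.normal(0.0, 0.1, 3)    # mutate one neuron's weights only
    f = fitness(trial)
    if f > best:                           # greedy cooperative update
        subs, best = trial, f

print(f"final MSE: {-best:.4f}")
```

The key property this illustrates is that each subcomponent's fitness is only ever measured in the context of the full assembled network, which is what makes the coevolution cooperative.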

Keywords: cooperative coevolution, feed forward network, problem decomposition, neuron, synapse

Procedia PDF Downloads 335
16267 Control of Hybrid System Using Fuzzy Logic

Authors: Faiza Mahi, Fatima Debbat, Mohamed Fayçal Khelfi

Abstract:

This paper proposes a control approach using a fuzzy logic system. More precisely, the study focuses on the analysis and control of a transportation system in order to improve the service to users, namely their waiting times at passenger exchange platforms. Many studies have been developed in the literature for this problem, and many control tools have been proposed. Here we focus on the use of fuzzy logic to control the system during its evolution, so as to minimize the arrival gap between connected transportation means at passenger exchange points. The modeling and simulation of such scheduling systems is an important area of research. A hybrid simulator developed as a Matlab toolbox computes the waiting time for each transportation mode; an illustrative example is worked out and the obtained results are reported.
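As a minimal sketch of the kind of rule base involved, assume a single input (the predicted arrival gap, in minutes, between two connected vehicles) and a zero-order Sugeno-style aggregation. The membership functions, rules and crisp hold times below are illustrative assumptions, not the controller of this study:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def hold_time(gap_minutes):
    """Fuzzy rules: small gap -> no hold, medium -> short hold, large -> long hold."""
    small = tri(gap_minutes, -1.0, 0.0, 4.0)
    medium = tri(gap_minutes, 2.0, 6.0, 10.0)
    large = tri(gap_minutes, 8.0, 12.0, 30.0)
    weights = [small, medium, large]
    outputs = [0.0, 3.0, 8.0]          # crisp hold times (minutes) per rule
    total = sum(weights)
    if total == 0.0:
        return outputs[-1]             # gap beyond the universe: maximum hold
    # Weighted average of the rule outputs (zero-order Sugeno defuzzification).
    return sum(w * o for w, o in zip(weights, outputs)) / total

print(hold_time(0.5), hold_time(6.0), hold_time(12.0))  # → 0.0 3.0 8.0
```

Between the rule peaks, the output interpolates smoothly (e.g., a 3-minute gap yields a 1.5-minute hold), which is the practical appeal of fuzzy control over a hard threshold here.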

Keywords: fuzzy logic, hybrid system, waiting time, transportation system, control

Procedia PDF Downloads 555
16266 Optimizing Approach for Sifting Process to Solve a Common Type of Empirical Mode Decomposition Mode Mixing

Authors: Saad Al-Baddai, Karema Al-Subari, Elmar Lang, Bernd Ludwig

Abstract:

Empirical mode decomposition (EMD), a data-driven method of time-series decomposition, has the advantage of not assuming that a time series is linear or stationary, as is implicitly assumed in Fourier decomposition. However, EMD suffers from a mode mixing problem in some cases. The aim of this paper is to present a solution for a common class of signals that cause the EMD mode mixing problem, namely signals containing an intermittency. On an artificial example, the solution shows superior performance in coping with the mode mixing problem compared with conventional EMD and Ensemble Empirical Mode Decomposition (EEMD). Furthermore, the over-sifting problem is completely avoided, and the computational load is reduced to roughly one sixth of that of EEMD with an ensemble number of 50.
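The sifting process at the heart of EMD can be sketched as follows: repeatedly subtract the mean of the upper and lower extrema envelopes until the candidate behaves like an intrinsic mode function. The fixed iteration count and the endpoint handling are common simplifications, not this paper's scheme:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, n_iter=8):
    """One EMD sifting loop (simplified).

    Upper/lower envelopes are cubic splines through the local maxima/minima;
    endpoints are used as envelope knots to avoid spline extrapolation,
    a common boundary simplification.
    """
    idx = np.arange(len(x))
    h = np.asarray(x, dtype=float).copy()
    for _ in range(n_iter):
        mx = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        mn = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(mx) < 2 or len(mn) < 2:
            break                              # too few extrema: residual trend
        kx = np.r_[0, mx, len(h) - 1]
        kn = np.r_[0, mn, len(h) - 1]
        upper = CubicSpline(kx, h[kx])(idx)
        lower = CubicSpline(kn, h[kn])(idx)
        h -= (upper + lower) / 2.0             # remove the local envelope mean
    return h

t = np.linspace(0, 1, 1000)
fast = np.sin(2 * np.pi * 40 * t)
slow = 0.5 * np.sin(2 * np.pi * 3 * t)
imf1 = sift(fast + slow)
# The first IMF should recover mostly the fast oscillation.
print(round(float(np.corrcoef(imf1, fast)[0, 1]), 3))
```

Mode mixing arises precisely when an intermittent burst distorts the extrema sequence, so the envelopes of one sifting pass straddle two physical modes at once; that is the failure case the paper's modified sifting targets.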

Keywords: empirical mode decomposition (EMD), mode mixing, sifting process, over-sifting

Procedia PDF Downloads 395
16265 Construction of Graph Signal Modulations via Graph Fourier Transform and Its Applications

Authors: Xianwei Zheng, Yuan Yan Tang

Abstract:

The classical windowed Fourier transform has been widely used in signal processing, image processing, machine learning and pattern recognition, and the related Gabor transform is powerful enough to capture the texture information of a given dataset. Recently, in the emerging field of graph signal processing, researchers have been developing a theory for handling so-called graph signals. Within this theory, the windowed graph Fourier transform has been constructed to establish a time-frequency analysis framework for graph signals. It is defined using translation and modulation operators for graph signals, following calculations similar to those of the classical windowed Fourier transform, with the Laplacian eigenvectors playing the role of the Fourier atoms: the translation of a graph signal is defined through the Laplacian eigenvectors in the same manner that classical translation is defined through the Fourier atoms, and the modulation of a graph signal is likewise established through the Laplacian eigenvectors. The windowed graph Fourier transform based on these two operators has been applied to obtain time-frequency representations of graph signals. However, the existing modulation operator is defined, in analogy with classical modulation, by multiplying a graph signal with the entries of a Laplacian eigenvector; a single eigenvector entry cannot play the same role as a Fourier atom, and this definition ignores the relationship between the translation and modulation operators. In this paper, a new definition of the modulation operator is proposed, and with it another time-frequency framework for graph signals is constructed.
The required relationship can be established through the Fourier transform: for any signal, the Fourier transform of its translation is the modulation of its Fourier transform, so the modulation of a signal can be defined as the inverse Fourier transform of the translation of its Fourier transform. Analogously, the graph modulation of a graph signal can be defined as the inverse graph Fourier transform of the translation of its graph Fourier transform. This novel definition of the graph modulation operator restores the relationship between the translation and modulation operations. The new modulation operation and the original translation operation are then used to construct a new framework for graph signal time-frequency analysis, and a windowed graph Fourier frame theory is developed: necessary and sufficient conditions for constructing windowed graph Fourier frames, tight frames and dual frames are presented. The novel framework is applied to signals defined on well-known graphs, e.g., the Minnesota road graph, and on random graphs. Experimental results show that the novel framework captures new features of graph signals.
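On a small example graph, this machinery reads as follows. The translation operator uses the standard spectral definition, and the modulation implements the proposal above as the inverse GFT of the translated spectrum; the graph choice and the sqrt(N) normalisation are illustrative:

```python
import numpy as np

N = 8
A = np.zeros((N, N))
for i in range(N - 1):                  # unweighted path graph
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A          # combinatorial graph Laplacian
lam, U = np.linalg.eigh(L)              # Laplacian eigenvectors = Fourier basis

def gft(f):
    """Graph Fourier transform: projection onto the Laplacian eigenvectors."""
    return U.T @ f

def igft(fhat):
    """Inverse graph Fourier transform."""
    return U @ fhat

def translate(f, i):
    """Standard spectral translation of f to vertex i."""
    return np.sqrt(N) * igft(gft(f) * U[i, :])

def modulate(f, k):
    """Proposed modulation: inverse GFT of the translation of the spectrum,
    mirroring the classical identity (translation)^ = modulation of hat(f)."""
    return igft(translate(gft(f), k))

f = np.exp(-0.5 * (np.arange(N) - 3.0) ** 2)   # smooth test signal
print(np.allclose(igft(gft(f)), f))            # → True  (exact round trip)
print(np.isclose(np.linalg.norm(gft(f)), np.linalg.norm(f)))  # → True (Parseval)
g = modulate(f, 2)                             # modulated copy of f
```

Because `U` from `eigh` is orthogonal, the GFT is unitary, which is what makes the translated-spectrum definition of modulation well posed.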

Keywords: graph signals, windowed graph Fourier transform, windowed graph Fourier frames, vertex frequency analysis

Procedia PDF Downloads 341
16264 Study of Biofuel Produced by Babassu Oil Fatty Acids Esterification

Authors: F. A. F. da Ponte, J. Q. Malveira, I. A. Maciel, M. C. G. Albuquerque

Abstract:

In this work, aviation biofuel production by esterification of fatty acids (C6 to C16) was studied. The process variables in heterogeneous catalysis were evaluated using an experimental design, with temperature and reaction time as the studied parameters and the methyl ester content as the response. An ion-exchange resin was used as the heterogeneous catalyst. The process was optimized using response surface methodology (RSM) with a second-order polynomial model. The results show that temperature and reaction time were the most influential variables on the response. The best methyl ester conversion in the experimental design, 96.5 wt% of biofuel, was obtained under the following conditions: 10 wt% of catalyst, 100 °C and 4 hours of reaction.
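The second-order RSM model referred to above, y = b0 + b1·T + b2·t + b11·T² + b22·t² + b12·T·t, can be fitted by ordinary least squares. The design points and conversion values below are made-up illustration data chosen to peak near the reported optimum, not the paper's measurements:

```python
import numpy as np

# Hypothetical factorial-style design: temperature (°C), time (h), conversion (wt%).
T = np.array([60, 60, 80, 80, 100, 100, 80, 100, 60], dtype=float)
t = np.array([2, 4, 2, 4, 2, 4, 3, 3, 3], dtype=float)
y = np.array([70, 78, 80, 88, 86, 96.5, 84, 92, 75])

# Design matrix for the second-order polynomial model.
X = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(temp, time):
    """Evaluate the fitted response surface at one (temperature, time) point."""
    return float(np.array([1.0, temp, time, temp**2, time**2, temp * time]) @ beta)

# The fitted surface should place the highest conversion near 100 °C, 4 h,
# consistent with the optimum reported in the abstract.
grid = [(temp, time) for temp in (60, 80, 100) for time in (2, 3, 4)]
best = max(grid, key=lambda p: predict(*p))
print(best)
```

In a real RSM study the fitted coefficients would also be screened by ANOVA before the surface is used for optimization; the sketch only shows the fitting and grid-evaluation step.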

Keywords: esterification, ion-exchange resins, response surface methodology, biofuel

Procedia PDF Downloads 495