Search results for: in no time
15804 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach
Authors: D. Tedesco, G. Feletti, P. Trucco
Abstract:
The present study aims to develop a Decision Support System (DSS) to support the operational decision of the Emergency Medical Service (EMS) regarding the assignment of medical emergency requests to Emergency Departments (ED). In the literature, this problem is also known as "hospital selection" and concerns the definition of policies for selecting the ED to which patients who require further treatment are transported by ambulance. The research methodology consists of a first phase reviewing the technical-scientific literature on DSSs supporting EMS management and, in particular, the hospital selection decision. The literature analysis showed that current studies are mainly focused on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a request. Therefore, all ED-related issues are excluded and treated as part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes the transport time and releases the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that considers information relating to the subsequent phases of the process, such as the case-mix, the expected service throughput times, and the operational capacity of the different EDs. To this end, a Discrete Event Simulation (DES) model was created to evaluate different hospital selection policies. The next steps of the research therefore consisted of developing a general simulation architecture, implementing it in the AnyLogic software, and validating it on a realistic dataset. The hospital selection policy that produced the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the ED until the beginning of the clinical evaluation by the doctor. Finally, two approaches were compared: a static approach, based on a retrospective estimate of the TTP, and a dynamic approach, based on a predictive estimate of the TTP determined with a constantly updated Winters model. Findings reveal that adopting the minimization of TTP as a hospital selection policy brings several benefits. It significantly reduces service throughput times in the ED with only a minimal increase in travel time. Furthermore, it provides an immediate view of the saturation state of the EDs and takes into account the case-mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. Besides, a predictive approach is certainly more reliable in terms of TTP estimation than a retrospective one, but it is more difficult to apply. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.
Keywords: discrete event simulation, emergency medical services, forecast model, hospital selection
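As a side note for readers, the dispatch rule behind the TTP policy is compact enough to sketch in a few lines. The following Python snippet is only a hypothetical illustration (not the authors' AnyLogic model); the ED names, travel times, and waiting-time estimates are invented.

```python
# Hypothetical sketch of a hospital-selection rule that minimizes the predicted
# Time To Provider (TTP = travel time to the ED + expected wait until the first
# clinical evaluation). Names and numbers are illustrative, not the study's data.

def select_ed(travel_time, expected_wait):
    """Return the ED with the smallest predicted TTP.

    travel_time   : dict ED name -> minutes of ambulance travel
    expected_wait : dict ED name -> predicted door-to-doctor wait in minutes
                    (static: retrospective average; dynamic: e.g. a Winters forecast)
    """
    ttp = {ed: travel_time[ed] + expected_wait[ed] for ed in travel_time}
    return min(ttp, key=ttp.get), ttp

# A proximity policy would instead pick min(travel_time, key=travel_time.get).
travel_time = {"ED_A": 8, "ED_B": 14, "ED_C": 11}      # minutes (hypothetical)
expected_wait = {"ED_A": 55, "ED_B": 20, "ED_C": 35}   # minutes (hypothetical)

best_ed, ttp = select_ed(travel_time, expected_wait)
print(best_ed, ttp)   # ED_B wins on TTP even though ED_A is closest
```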
15803 Effect of Treated Peat Soil on the Plasticity Index and Hardening Time
Authors: Siti Nur Aida Mario, Farah Hafifee Ahmad, Rudy Tawie
Abstract:
Soil stabilization has been widely implemented in the construction industry nowadays. Peat soil is well known as one of the most problematic soils among engineers. The stabilization procedures need to take into account both the physical and the engineering properties of the stabilized peat soil. This paper presents the plasticity index and hardening results of peat soil treated with various dosages of additives. In order to determine the plasticity of the treated peat soil, Atterberg limit tests, which comprise the plastic limit and liquid limit tests, were conducted. The liquid limit in this experimental study was determined using a cone penetrometer. A Vicat testing apparatus was used in the hardening test, in which the penetration of the plunger was recorded every hour for 24 hours. The results show that the peat soil stabilized with 80% FAAC and 20% OPC had the lowest plasticity index and recorded the fastest initial setting time. The significance of this study is to promote a greener solution for the future soil stabilization industry.
Keywords: additives, hardening, peat soil, plasticity index, soil stabilization
15802 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications
Authors: Atish Bagchi, Siva Chandrasekaran
Abstract:
Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification, and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by the industries that lead to unplanned downtimes are: the current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but occurring at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and the consequential financial losses. Method: The data stored within a process control system, e.g., Industrial IoT or a Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian to optimise storage and query performance. The sampling may inadvertently discard values that contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of GNNs, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal context, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (by 20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's process control system in real time to perform forecasting and classification tasks, helping asset management engineers operate their machines more efficiently and reduce unplanned downtimes. A series of trials of this model is planned in other manufacturing industries in the future.
Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning
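To make the "entropy and spectral changes" idea concrete, here is a minimal, hypothetical Python sketch of such window features computed on a synthetic sensor stream; it illustrates the kind of inputs described, not the authors' GNN pipeline, and all data and parameters are invented.

```python
import numpy as np

def window_entropy(x, bins=16):
    """Shannon entropy of the value distribution in one window."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def spectral_change(prev, curr):
    """L2 distance between normalized magnitude spectra of consecutive windows."""
    def spec(w):
        s = np.abs(np.fft.rfft(w - np.mean(w)))
        return s / (np.linalg.norm(s) + 1e-12)
    return np.linalg.norm(spec(curr) - spec(prev))

# Hypothetical sensor stream: a subtle shift in mean and variance halfway through.
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(5.0, 0.1, 600), rng.normal(5.3, 0.4, 600)])
windows = signal.reshape(-1, 100)            # 12 windows of 100 samples

features = [(window_entropy(w), spectral_change(windows[max(i - 1, 0)], w))
            for i, w in enumerate(windows)]
# These (entropy, spectral-change) pairs would be fed, with temporal context,
# to the forecasting/classification model described in the abstract.
```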
15801 Optimization, Yield and Chemical Composition of Essential Oil from Cymbopogon citratus: Comparative Study with Microwave Assisted Extraction and Hydrodistillation
Authors: Irsha Dhotre
Abstract:
Cymbopogon citratus is generally known as Indian lemongrass and is widely used in the cosmetic, pharmaceutical, dairy pudding, and food industries. To enhance the quality of extraction, microwave-oven-assisted hydrodistillation was implemented. The basic parameters that influence the rate of extraction were considered: the extraction temperature, the extraction time, and the applied microwave-oven power. Locally available CKP 25 Cymbopogon citratus was used for the extraction of essential oil. The extraction parameters were optimized with a Box–Behnken design (BBD) evaluated using the Design-Expert 13 software. The regression model revealed that the optimum extraction parameters are a temperature of 35 °C, an extraction time of 130 minutes, and a microwave-oven power of 700 W, with an extraction yield of 4.76%. Gas Chromatography-Mass Spectrometry (GC-MS) analysis confirmed the significant components present in the extracted lemongrass oil.
Keywords: Box–Behnken design, Cymbopogon citratus, hydrodistillation, microwave-oven, response surface methodology
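For readers unfamiliar with how a Box–Behnken dataset is turned into optimum settings, the sketch below fits a full quadratic response-surface model by least squares and searches it over the coded factor space. The design points and yields are made up for illustration; they are not the paper's measurements.

```python
import numpy as np
from itertools import product

# Hypothetical coded Box-Behnken-style data for temperature, time, and microwave
# power (coded -1, 0, +1) with invented yield values; not the paper's data.
X = np.array([[-1,-1, 0], [ 1,-1, 0], [-1, 1, 0], [ 1, 1, 0],
              [-1, 0,-1], [ 1, 0,-1], [-1, 0, 1], [ 1, 0, 1],
              [ 0,-1,-1], [ 0, 1,-1], [ 0,-1, 1], [ 0, 1, 1],
              [ 0, 0, 0], [ 0, 0, 0], [ 0, 0, 0]], dtype=float)
y = np.array([3.1, 3.4, 3.6, 3.9, 3.0, 3.3, 3.8, 4.2,
              3.2, 3.5, 4.0, 4.4, 4.6, 4.7, 4.6])

def quad_terms(x):
    a, b, c = x
    return [1, a, b, c, a*b, a*c, b*c, a*a, b*b, c*c]

A = np.array([quad_terms(x) for x in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)      # full quadratic RSM fit

# Brute-force search of the fitted surface over the coded design space.
grid = np.linspace(-1, 1, 41)
best = max(product(grid, repeat=3), key=lambda x: np.dot(quad_terms(x), coef))
print("predicted optimum (coded factors):", best)
```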
15800 A Pole Radius Varying Notch Filter with Transient Suppression for Electrocardiogram
Authors: Ramesh Rajagopalan, Adam Dahlstrom
Abstract:
Noise removal techniques play a vital role in the performance of electrocardiographic (ECG) signal processing systems. ECG signals can be corrupted by various kinds of noise, such as baseline wander, electromyographic interference, and power-line interference. One of the significant challenges in ECG signal processing is the degradation caused by additive 50 or 60 Hz power-line interference. This work investigates the removal of power-line interference and the suppression of the transient response when filtering noise-corrupted ECG signals. We demonstrate the effectiveness of an Infinite Impulse Response (IIR) notch filter with a time-varying pole radius for improving the transient behavior. The temporary change in the pole radius of the filter diminishes the transient. Simulation results show that the proposed IIR filter with a time-varying pole radius outperforms traditional IIR notch filters in terms of mean square error and transient suppression.
Keywords: notch filter, ECG, transient, pole radius
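A minimal sketch of the idea, assuming a second-order notch whose pole radius is ramped from a small start-up value towards its steady-state value; the ECG stand-in signal and all parameter values below are illustrative, not those of the paper.

```python
import numpy as np

fs, f0 = 500.0, 60.0                     # sampling rate and notch frequency (Hz)
w0 = 2 * np.pi * f0 / fs
n = np.arange(int(2 * fs))               # two seconds of samples
ecg_like = np.sin(2 * np.pi * 1.2 * n / fs)        # crude stand-in for an ECG wave
x = ecg_like + 0.5 * np.sin(w0 * n)                # add 60 Hz power-line interference

# Pole radius ramps from 0.6 (wide notch, fast transient decay) towards 0.99
# (narrow notch, little ECG distortion) with a roughly 50 ms time constant.
r = 0.99 - (0.99 - 0.6) * np.exp(-n / (0.05 * fs))

# Notch: H(z) = (1 - 2cos(w0) z^-1 + z^-2) / (1 - 2 r cos(w0) z^-1 + r^2 z^-2)
y = np.zeros_like(x, dtype=float)
for k in range(len(x)):
    x1 = x[k - 1] if k >= 1 else 0.0
    x2 = x[k - 2] if k >= 2 else 0.0
    y1 = y[k - 1] if k >= 1 else 0.0
    y2 = y[k - 2] if k >= 2 else 0.0
    y[k] = (x[k] - 2 * np.cos(w0) * x1 + x2
            + 2 * r[k] * np.cos(w0) * y1 - r[k] ** 2 * y2)
# y now contains the ECG-like component with the 60 Hz line largely removed and
# a much shorter start-up transient than a fixed narrow-notch filter would give.
```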
15799 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods
Authors: A. Senthil Kumar, V. Murali Bhaskaran
Abstract:
In the field of information technology, people use various tools and software for both official and personal purposes. Nowadays, people struggle to choose data access and extraction tools when buying and selling their products, and they also worry about various quality factors such as price, durability, color, size, and availability of the product. The main purpose of this research study is to find solutions to these unsolved existing problems. The proposed Multidirectional Rank Prediction (MDRP) decision-making algorithm is designed to support effective strategic decisions at all levels of data extraction; it uses a real-time textile dataset and analyzes the results. Finally, the results are obtained and compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
Keywords: Knowledge Discovery in Database (KDD), Multidirectional Rank Prediction (MDRP), Pearson's Correlation Coefficient (PCC), Vector Space Similarity (VSS)
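Since the MDRP algorithm itself is not detailed in the abstract, the sketch below only illustrates the PCC-based collaborative-filtering baseline it is compared against: a missing product rating is predicted as a similarity-weighted average over positively correlated users. The rating matrix is invented.

```python
import numpy as np

# Hypothetical user-by-product rating matrix (0 = not rated); illustrative only.
R = np.array([[5, 3, 0, 1],
              [4, 0, 4, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)

def pearson(u, v):
    mask = (u > 0) & (v > 0)                      # co-rated products only
    if mask.sum() < 2:
        return 0.0
    a, b = u[mask] - u[mask].mean(), v[mask] - v[mask].mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def predict(R, user, item):
    """Similarity-weighted average of positively correlated users' ratings."""
    num = den = 0.0
    for other in range(R.shape[0]):
        if other == user or R[other, item] == 0:
            continue
        s = pearson(R[user], R[other])
        if s <= 0:                                # ignore non-positive neighbours
            continue
        num += s * R[other, item]
        den += s
    return num / den if den else R[R > 0].mean()

print(predict(R, user=0, item=2))   # predicted rating of product 2 for user 0
```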
15798 Effect of Carbon Nanotubes Functionalization with Nitrogen Groups on Pollutant Emissions in an Internal Combustion Engine
Authors: David Gamboa, Bernardo Herrera, Karen Cacua
Abstract:
Nanomaterials have been explored as alternatives to reduce particulate matter from diesel engines, which is one of the most common air pollutants in urban centers. However, the use of nanomaterials as additives for diesel has to overcome the instability of the dispersions to be considered viable for commercial use. In this work, carbon nanotubes were functionalized with amide groups to improve the stability of these nanomaterials in a mix of 90% petroleum diesel and 10% palm oil biodiesel (B10) at concentrations of 50 and 100 ppm. The resulting nanofuel was used as the fuel for a stationary internal combustion engine, where the particulate matter, NOx, and CO emissions were measured. The results showed that the use of amide groups significantly extends the time the carbon nanotubes remain suspended in the fuel, and at the same time these nanomaterials helped to reduce the particulate matter and NOx emissions. However, the CO emissions with the nanofuel were higher than those with the combustion of B10 alone. These results suggest that carbon nanotubes have thermal and catalytic effects on the combustion of B10.
Keywords: carbon nanotubes, diesel, internal combustion engine, particulate matter
15797 Physical Modeling of Woodwind Ancient Greek Musical Instruments: The Case of Plagiaulos
Authors: Dimitra Marini, Konstantinos Bakogiannis, Spyros Polychronopoulos, Georgios Kouroupetroglou
Abstract:
Archaeomusicology cannot entirely depend on the study of excavated ancient musical instruments, as most of the time their condition is not ideal (i.e., missing or eroded parts) and, moreover, because of the concern of damaging the originals during experiments. In order to overcome these obstacles, researchers build replicas. This technique is still the most popular one, although it is rather expensive and time-consuming. Throughout the last decades, the development of physical modeling techniques has provided tools that enable the study of musical instruments through their digitally simulated models. This is not only a more cost- and time-efficient technique but also provides additional flexibility, as the user can easily modify parameters such as the geometrical features and materials. This paper thoroughly describes the steps to create a physical model of a woodwind ancient Greek instrument, the Plagiaulos. This instrument can be considered the ancestor of the modern flute due to the common geometry and air-jet excitation mechanism. The Plagiaulos comprises a single resonator with an open end and a number of tone holes. The combination of closed and open tone holes produces the pitch variations. In this work, the effects of all the instrument's components are described by means of physics and then simulated based on digital waveguides. The synthesized sound of the proposed model complies with the theory, highlighting its validity. Further, the synthesized sound of the model simulating the Plagiaulos of Koile (2nd century BCE) was compared with its replica built in our laboratory by following the scientific methodologies of archaeomusicology. The aforementioned results verify that robust dynamic digital tools can be introduced into the field of computational, experimental archaeomusicology.
Keywords: archaeomusicology, digital waveguides, musical acoustics, physical modeling
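The abstract does not give the model equations, so the following toy sketch only illustrates the digital waveguide idea in its simplest form: a delay-line loop whose length sets the pitch of an open pipe, excited by a noise burst (essentially a Karplus-Strong-style resonator, far simpler than the Plagiaulos model with tone holes and an air-jet excitation described above). The bore length and all constants are assumptions.

```python
import numpy as np

fs = 44100                      # sample rate (Hz)
c = 343.0                       # speed of sound (m/s)
L = 0.35                        # hypothetical acoustic length of the bore (m)
f0 = c / (2 * L)                # open-open pipe fundamental, roughly 490 Hz
N = int(round(fs / f0))         # round-trip delay in samples

rng = np.random.default_rng(1)
delay = list(rng.uniform(-1, 1, N))     # noise burst standing in for the air jet
out = np.zeros(fs)                      # one second of synthesized sound

for i in range(len(out)):
    first = delay.pop(0)
    second = delay[0] if delay else first
    # averaging two samples acts as a one-pole low-pass; 0.995 adds propagation loss
    delay.append(0.995 * 0.5 * (first + second))
    out[i] = first
# 'out' rings at roughly f0; real waveguide models add tone-hole scattering
# junctions and a jet/labium excitation model, as described in the paper.
```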
15796 Forecasting Stock Prices Based on the Residual Income Valuation Model: Evidence from a Time-Series Approach
Authors: Chen-Yin Kuo, Yung-Hsin Lee
Abstract:
Previous studies applying the residual income valuation (RIV) model generally use panel data and single-equation models to forecast stock prices. Unlike these, this paper uses Taiwanese longitudinal data to estimate multi-equation time-series models such as the Vector Autoregressive (VAR) model and the Vector Error Correction Model (VECM) and to conduct out-of-sample forecasting. Further, this work assesses their forecasting performance with two instruments. In line with extant research, the major finding shows that the VECM outperforms the other three models in forecasting for three stock sectors over all horizons. This implies that an error correction term containing long-run information contributes to improved forecasting accuracy. Moreover, the composite pattern shows that at longer horizons, the VECM produces the greater reduction in errors and performs substantially better than the VAR.
Keywords: residual income valuation model, vector error correction model, out-of-sample forecasting, forecasting accuracy
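A hedged sketch of the out-of-sample exercise using the VECM and VAR implementations available in statsmodels, run on a synthetic cointegrated pair; the lag order, cointegration rank, deterministic term, and data are placeholders rather than the paper's Taiwanese series.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM
from statsmodels.tsa.api import VAR

# Synthetic cointegrated pair standing in for price and residual-income series.
rng = np.random.default_rng(42)
n = 300
common = np.cumsum(rng.normal(size=n))                 # shared stochastic trend
data = pd.DataFrame({
    "price":        common + rng.normal(scale=0.5, size=n),
    "resid_income": 0.8 * common + rng.normal(scale=0.5, size=n),
})

h = 12
train, test = data.iloc[:-h], data.iloc[-h:]

# VECM with placeholder lag order and cointegration rank.
vecm_res = VECM(train, k_ar_diff=2, coint_rank=1, deterministic="ci").fit()
vecm_fc = vecm_res.predict(steps=h)

# VAR in levels as a comparison model.
var_res = VAR(train).fit(maxlags=3)
var_fc = var_res.forecast(train.values[-var_res.k_ar:], steps=h)

rmse = lambda f: np.sqrt(np.mean((f - test.values) ** 2))
print("VECM RMSE:", rmse(vecm_fc), "VAR RMSE:", rmse(var_fc))
```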
15795 Improving the Analytical Power of Dynamic DEA Models by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach
Authors: Elias K. Maragos, Petros E. Maravelakis
Abstract:
In Dynamic Data Envelopment Analysis (DDEA), which is a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this case, as accepted by most researchers, there are outputs produced by a DMU to be used as inputs at a future time. Those outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of the input, output, or intermediate data, assuming that the distribution of their virtual values does not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
Keywords: Dynamic Data Envelopment Analysis, DDEA, piecewise linear inputs, piecewise linear outputs
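For readers who want to see the building block being extended, the sketch below solves a standard static, input-oriented CCR DEA efficiency program with scipy's linear-programming routine; it does not implement the authors' piecewise linear dynamic extension, and the DMU data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical DMU data: rows = DMUs, columns = inputs / outputs (illustrative).
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])  # inputs
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])                           # outputs

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (1.0 = efficient)."""
    n_dmu, n_in = X.shape
    n_out = Y.shape[1]
    c = np.r_[1.0, np.zeros(n_dmu)]                      # minimise theta
    # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
    b_in = np.zeros(n_in)
    # outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((n_out, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n_dmu, method="highs")
    return res.fun

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_input_efficiency(X, Y, o):.3f}")
```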
15794 Multiobjective Optimization of Wastewater Treatment by Electrochemical Process
Authors: Malek Bendjaballah, Hacina Saidi, Sarra Hamidoud
Abstract:
The aim of this study is to model and optimize both the performance of a new electrocoagulation (EC) process for the treatment of wastewater and its energy consumption, in order to extrapolate it to the industrial scale. Through the judicious application of an experimental design (DOE), it has been possible to evaluate the individual effects and interactions that have a significant influence on both objective functions (maximizing efficiency and minimizing energy consumption), using aluminum electrodes as the sacrificial anode. Preliminary experiments showed that the pH of the medium, the applied potential, and the EC treatment time are the main parameters. A 3³ factorial design was adopted to model performance and energy consumption. Under optimal conditions, the pollution reduction efficiency is 93%, combined with a minimum energy consumption of 2.60×10⁻³ kWh/mg COD. The applied potential or current, the processing time, and their interaction were the most influential parameters in the mathematical models obtained. The modeling results also correlated well with the experimental ones. The results offer promising opportunities to develop a clean and inexpensive technology to eliminate or reduce wastewater pollution.
Keywords: electrocoagulation, green process, experimental design, optimization
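Once quadratic models have been fitted to the 3³ design, the two objectives can be traded off with a Derringer-Suich-style desirability search, as sketched below. The model coefficients and desirability bounds are invented placeholders, not the fitted models or ranges from this study.

```python
import numpy as np
from itertools import product

# Invented quadratic response models over coded factors (pH, potential, time);
# placeholders only, not the models fitted in the study.
def efficiency(x):            # pollutant removal efficiency, %
    p, u, t = x
    return 80 + 6*u + 5*t - 4*p - 3*u*u - 2*t*t - 1.5*p*u

def energy(x):                # energy consumption, 1e-3 kWh per mg COD
    p, u, t = x
    return 3.0 + 1.2*u + 0.9*t + 0.3*u*t

def desirability(x, eff_lo=70, eff_hi=95, en_lo=2.0, en_hi=5.0):
    d1 = np.clip((efficiency(x) - eff_lo) / (eff_hi - eff_lo), 0, 1)   # maximise
    d2 = np.clip((en_hi - energy(x)) / (en_hi - en_lo), 0, 1)          # minimise
    return np.sqrt(d1 * d2)                                            # geometric mean

grid = np.linspace(-1, 1, 21)
best = max(product(grid, repeat=3), key=desirability)
print("best coded settings (pH, potential, time):", best,
      "efficiency:", round(efficiency(best), 1), "energy:", round(energy(best), 2))
```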
15793 Spatial Data Mining by Decision Trees
Authors: Sihem Oujdi, Hafida Belbachir
Abstract:
Existing data mining methods cannot be applied directly to spatial data because spatial data require the consideration of spatial specificities, such as spatial relationships. This paper focuses on classification with decision trees, which are one of the data mining techniques. We propose an extension of the C4.5 algorithm for spatial data based on two different approaches: join materialization and querying the different tables on the fly. Similar work has been done on these two main approaches; the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the different tables on the fly, saves memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships between the objects in the target table and those in the neighbor table. The proposed algorithms are applied to spatial data from the accidentology domain. A comparative study of our approach with other work on classification by spatial decision trees is also detailed.
Keywords: C4.5 algorithm, decision trees, S-CART, spatial data mining
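A hedged sketch of the join-materialization idea: spatial relationships between target objects (accidents) and neighbor objects are materialized once into a join table, aggregated into per-object attributes, and fed to an entropy-based decision tree. The synthetic points, the distance-based "near" relation, and the use of scikit-learn are illustrative stand-ins for the paper's spatial predicates and modified C4.5.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Target table: accident locations with a class label (severe / not severe).
acc_xy = rng.uniform(0, 10, size=(40, 2))
severe = (acc_xy[:, 0] > 5).astype(int)          # synthetic label, illustrative

# Neighbor table: e.g. junction locations.
junc_xy = rng.uniform(0, 10, size=(15, 2))

# Materialized spatial join index: (accident_id, junction_id) pairs satisfying
# the spatial relation "near" (distance below a threshold).
THRESH = 1.5
join_index = [(i, j)
              for i in range(len(acc_xy))
              for j in range(len(junc_xy))
              if np.linalg.norm(acc_xy[i] - junc_xy[j]) < THRESH]

# Aggregate the join into per-accident spatial attributes for the classifier.
near_count = np.zeros(len(acc_xy))
for i, _ in join_index:
    near_count[i] += 1
features = np.column_stack([acc_xy, near_count])

tree = DecisionTreeClassifier(criterion="entropy", max_depth=3)  # C4.5-like split rule
tree.fit(features, severe)
print("training accuracy:", tree.score(features, severe))
```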
15792 Magnetic versus Non-Magnetic Adatoms in Graphene Nanoribbons: Tuning of Spintronic Applications and the Quantum Spin Hall Phase
Authors: Saurabh Basu, Sudin Ganguly
Abstract:
Conductance in graphene nanoribbons (GNRs) in the presence of magnetic (for example, iron) and non-magnetic (for example, gold) adatoms is explored theoretically within a Kane-Mele model for possible spintronic applications and topologically non-trivial properties. In our work, we have considered the magnetic adatoms to induce a Rashba spin-orbit coupling (RSOC) and an exchange bias field, while the non-magnetic ones induce an RSOC and an intrinsic spin-orbit (SO) coupling. Even though RSOC is present in both, they represent very different physical situations: the magnetic adatoms do not preserve time-reversal symmetry, while the non-magnetic case does. This has important implications for the topological properties. For example, with non-magnetic adatoms and moderately strong values of the SO coupling, the GNR is a quantum spin Hall insulator, as evident from a 2e²/h plateau in the longitudinal conductance and the presence of distinct conducting edge states with an insulating bulk. Since the edge states are protected by time-reversal symmetry, the magnetic adatoms in the GNR yield trivial insulators that do not possess any non-trivial topological property. However, they have greater utility than the non-magnetic adatoms from the point of view of spintronic applications. Owing to the broken spatial symmetry induced by the presence of adatoms of either type, all the x, y, and z components of the spin-polarized conductance become non-zero (only the y-component survives in pristine graphene owing to a mirror symmetry present there), and the system hence becomes suitable for spintronic applications. However, the values of the spin-polarized conductances are at least two orders of magnitude larger in the case of magnetic adatoms than for their non-magnetic counterpart, thereby ensuring more efficient spintronic applications. Further, the applications are tunable by altering the adatom densities.
Keywords: magnetic and non-magnetic adatoms, quantum spin Hall phase, spintronic applications, spin polarized conductance, time reversal symmetry
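For reference, the Kane-Mele tight-binding Hamiltonian that underlies such studies can be written as below; the last term, with strength M, is the exchange field commonly added to model magnetic adatoms and is stated here as an assumption, since the abstract does not spell out the authors' exact on-site terms.

```latex
H \;=\; t\sum_{\langle i,j\rangle} c_i^{\dagger} c_j
\;+\; i\lambda_{\mathrm{SO}} \sum_{\langle\langle i,j\rangle\rangle} \nu_{ij}\, c_i^{\dagger} s^{z} c_j
\;+\; i\lambda_{\mathrm{R}} \sum_{\langle i,j\rangle} c_i^{\dagger} \left(\mathbf{s}\times\hat{\mathbf{d}}_{ij}\right)_{z} c_j
\;+\; M \sum_{i} c_i^{\dagger} s^{z} c_i
```

Here t is the nearest-neighbour hopping, λ_SO the intrinsic SO coupling on next-nearest-neighbour bonds with ν_ij = ±1 set by the orientation of the connecting path, λ_R the Rashba coupling induced by the adatoms, and M the exchange field; M = 0 preserves time-reversal symmetry (non-magnetic case, quantum spin Hall phase possible), while M ≠ 0 breaks it (magnetic case).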
15791 Method and System of Malay Traditional Women Apparel Pattern Drafting for Hazi Attire
Authors: Haziyah Hussin
Abstract:
The Hazi Attire software is purposely designed for pattern drafting of traditional Malay women's apparel. It was created using a LISP program that works under the AutoCAD engine and is able to draft various patterns for Malay women's apparel in fitted, semi-fitted, and loose silhouettes. It is fully automatic: the user can select styles from the menu on the screen and enter the measurements, and within five seconds the patterns are ready to be printed and sewn. Hazi Attire is different from other programmes available in the market since it is fully automatic, user-friendly, and able to print the chosen pattern quickly and accurately. With this software (Hazi Attire), patterns for the selected styles can be generated according to made-to-measure or standard sizes. It would benefit the apparel industry by reducing manufacturing lead time and cycle time.
Keywords: basic pattern, pattern drafting, toile, Malay traditional women apparel, measurement parameters, fitted, semi-fitted and loose silhouette
15790 Sulfamethoxazole (SMX) Removal by Microwave-Assisted Heterogeneous Fenton Reaction Involving Synthetic Clay (LDHs)
Authors: Chebli Derradji, Abdallah Bouguettoucha, Zoubir Manaa, S. Nacef, A. Amrane
Abstract:
Antibiotics are major pollutants of wastewater, not only due to their stability in biological systems but also due to their impact on public health. Their degradation by means of hydroxyl radicals, generated through the application of microwaves in the presence of hydrogen peroxide and two solid catalysts, an iron-based synthetic clay (LDHs) and goethite (FeOOH), has been examined. A drastic reduction of the degradation yield was observed above pH 4, and hence the optimal conditions were found to be a pH of 3, 0.1 g/L of clay, a somewhat low amount of H2O2 (1.74 mmol/L), and a microwave power of 850 W. It should be noted that, to maintain an almost constant temperature, cooling with cold water was always applied between two microwave runs; the ratio between microwave heating time and cooling time was 1. The obtained SMX degradation was 98.8 ± 0.2% after 30 minutes of microwave treatment. It should also be noted that, in the absence of the solid catalyst (LDHs), no SMX degradation was observed. From this, the use of microwaves in the presence of a solid source of iron (LDHs) appears to be an efficient solution for the treatment of wastewater containing SMX.
Keywords: microwave, Fenton, heterogeneous Fenton, degradation, oxidation, antibiotics
15789 Examination of How Smart Watches Influence the Market of Luxury Watches with Particular Regard to the Buying Reasons
Authors: Christopher Benedikt Jakob
Abstract:
In our current society, there is no need to look at a wristwatch to know the exact time; smartphones, the clock in the car, or the computer clock inform us about the time too. Yet over hundreds of years, luxury watches have held a fascination for human beings. Consumers buy watches that cost thousands of euros, although they could buy much cheaper watches that also fulfill the function of indicating the correct time. This shows that functional value has only a minor meaning among the buying reasons for luxury watches. For a few years now, people have had an increased demand to track data such as their daily walking distance or their sleep, and smart watches enable consumers to obtain this information. There is a trend for people to optimise parts of their social life, and thus they get the impression that they are able to optimise themselves as human beings. With the help of smart watches, they are able to optimise parts of their productivity and realise their targets at the same time. These smart watches are also offered as luxury models, and the question is: how will customers of traditional luxury watches react? Therefore, this study intends to answer the question of why people are willing to spend an enormous amount of money on luxury watches. The self-expression model, the relationship basis model, the functional benefit representation model, and means-end theory are chosen as an appropriate methodology to find reasons why human beings purchase specific luxury watches and luxury smart watches. This evaluative approach further discusses these strategies concerning, for example, whether consumers buy luxury watches or smart watches to express the current self or the ideal self and whether human beings make decisions based on expected results. The research critically evaluates the view that relationships are compared on the basis of their advantages: luxury brands offer socio-emotional advantages, such as the social function of identification, and the strong brand personality of luxury watches and luxury smart watches helps customers to structure and retrieve brand awareness, which simplifies the process of decision-making. One of the goals is to identify whether customers know why they like specific luxury watches and dislike others, although they are produced in the same country and cost comparable prices. It is very obvious that the market for luxury watches, especially for luxury smart watches, is changing far faster than it has in the past. Therefore, the research examines the market-changing parameters in detail.
Keywords: buying-behaviour, brand management, consumer, luxury watch, smart watch
15788 Student Diversity in Higher Education: The Impact of Digital Elements on Student Learning Behavior and Subject-Specific Preferences
Authors: Pia Kastl
Abstract:
By combining face-to-face sessions with digital self-learning units, the learning process can be enhanced and learning success improved. The potential of blended learning lies in its flexibility and in the possibility of getting in touch with lecturers and fellow students face-to-face. It also offers the opportunity to individualize and self-regulate the learning process. The aim of this article is to analyse how different learning environments affect students' learning behavior and how digital tools can be used effectively. The analysis also considers the extent to which the field of study affects the students' preferences. Semi-structured interviews were conducted with students from different disciplines at two German universities (N = 60). The questions addressed satisfaction with and perception of online, face-to-face, and blended learning courses. In addition, suggestions for improving the learning experience and the use of digital tools in the different learning environments were surveyed. The results show that being present on campus has a positive impact on learning success and that online teaching facilitates flexible learning. Blended learning can combine the respective benefits, although one challenge is to keep the time investment within reasonable limits. The use of digital tools differs depending on the subject. Medical students are willing to use digital tools to improve their learning success and voluntarily invest more time. Students of the humanities and social sciences, on the other hand, are reluctant to invest additional time; they do not see extra study material as an additional benefit to their learning success. This study illustrates how these heterogeneous demands on learning environments can be met. In addition, potential for improvement is identified in order to foster both the learning process and learning success. Learning environments can be meaningfully enriched with digital elements to address student diversity in higher education.
Keywords: blended learning, higher education, diversity, learning styles
15787 Study of the Effect of Bulk Traps on a Solar-Blind Photodetector Based on an IZTO/β-Ga2O3/ITO Schottky Diode
Authors: Laboratory of Semiconducting and Metallic Materials (LMSM), Biskra, Algeria
Abstract:
An InZnSnO2 (IZTO)/β-Ga2O3 Schottky-barrier solar-blind photodetector (PD) exposed to 255 nm illumination was simulated and compared to the measurement. Numerical simulations successfully reproduced the photocurrent at reverse bias and the response by taking into account several factors, such as the conduction mechanisms and the material parameters. Reducing the trap density was adopted as an improvement, and the effect of reducing the bulk trap densities on the photocurrent, the response, and the time dependence (continuous photoconductivity) was studied. As the trap density decreased, the photocurrent increased. The response was 0.04 A/W for the lowest Ga2O3 trap density. The estimated decay time for the lowest intensity of the traps at ET (0.74, 1.04 eV) is 0.05 s and is shorter, at ∼0.015 s, for ET (0.55 eV). This indicates that the shallow traps (ET = 0.55 eV) had the dominant effect on the continuous photoconductivity phenomenon. Furthermore, with decreasing trap densities, this PD can be considered a self-powered solar-blind photodetector (SBPD).
Keywords: IZTO/β-Ga2O3, self-powered solar-blind photodetector, numerical simulation, bulk traps
15786 Effect of Thermal Treatment on Phenolic Content, Antioxidant, and Alpha-Amylase Inhibition Activities of Moringa stenopetala Leaves
Authors: Daniel Assefa, Engeda Dessalegn, Chetan Chauhan
Abstract:
Moringa stenopetala is a tree of socioeconomic value that is widely available and cultivated in the southern part of Ethiopia. The leaves have traditionally been used as a food source with high nutritional and medicinal value. The present work was carried out to evaluate the effect of thermal treatment on the total phenolic content, antioxidant activity, and alpha-amylase inhibition activity of aqueous leaf extracts obtained by maceration and by decoction at different time intervals (5, 10, and 15 min). The total phenolic content was determined by the Folin-Ciocalteu method, the antioxidant activities were determined by 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging, reducing power, and ferrous ion chelating assays, and the alpha-amylase inhibition activity was determined using the 3,5-dinitrosalicylic acid method. The total phenolic content ranged from 34.35 to 39.47 mg GAE/g. The 10 min decoction extract showed ferrous ion chelating (92.52), DPPH radical scavenging (91.52%), alpha-amylase inhibition (69.06%), and ferric reducing power (0.765) activities. DPPH, reducing power, and alpha-amylase inhibition activities showed positive linear correlations with total phenolic content (R² = 0.853, R² = 0.857, and R² = 0.930, respectively), but ferrous ion chelating activity was found to be weakly correlated (R² = 0.481). Based on the present investigation, it can be concluded that the major loss of total phenolic content, antioxidant activity, and alpha-amylase inhibition activity of the crude leaf extracts of Moringa stenopetala occurred at a decoction time of 15 min. Therefore, to maintain the total phenolic content, antioxidant, and alpha-amylase inhibition activities of the leaves, cooking should be limited to the optimum decoction time (5-10 min).
Keywords: alpha-amylase inhibition, antioxidant, Moringa stenopetala, total phenolic content
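The abstract does not spell out the calculation, but DPPH scavenging and alpha-amylase inhibition percentages of this kind are conventionally computed from the control and sample absorbances as follows (stated here as the standard convention, not a formula quoted from the paper):

```latex
\%\ \text{inhibition (or scavenging)} \;=\; \frac{A_{\text{control}} - A_{\text{sample}}}{A_{\text{control}}} \times 100
```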
15785 Empirical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a process of managing energy consumption with a view to energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection, feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, and appliance features are required for the accurate identification of the household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand and then detecting the times at which each selected appliance changes its state. In order to fit the capabilities of practical existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data is simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical software that uses behaviour simulation of the people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect. It also facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of the selected appliance falls, along with a time vector of the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical, and statistical features. Afterwards, those signatures are used to tune general model types for appliance identification using unsupervised algorithms. This method is evaluated using both data simulated with LPG and the real Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics, considering accuracy, precision, recall, and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (variance sliding window and cumulative sum).
Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques
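A hedged sketch of two ingredients named above, applied to made-up 1/60 Hz data: a simple step-change event detector on the aggregate power, and a small dynamic time warping (DTW) distance used to match the extracted segment against a stored appliance signature. It is an illustration only, not the authors' full framework.

```python
import numpy as np

# Made-up aggregate power at 1 sample per minute (1/60 Hz).
rng = np.random.default_rng(3)
power = np.concatenate([rng.normal(120, 3, 60),     # base load
                        rng.normal(165, 3, 25),     # appliance ON (~45 W extra)
                        rng.normal(120, 3, 60)])

# Event detection: flag samples where the step change exceeds a threshold.
diff = np.diff(power)
events = np.where(np.abs(diff) > 20)[0] + 1          # indices of state transitions
segment = power[events[0]:events[1]] - power[:events[0]].mean()  # appliance profile

def dtw(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance."""
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[-1, -1]

signature = np.full(20, 45.0)                        # stored signature (hypothetical)
print("events at minutes:", events,
      "DTW distance to signature:", dtw(segment, signature))
```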
15784 Structuring of Multilayer Aluminum Nickel by Lift-off Process Using Cheap Negative Resist
Authors: Muhammad Talal Asghar
Abstract:
The lift-off photoresist technique for metal patterning in integrated circuit (IC) packaging has been widely utilized in the fields of microelectromechanical systems and semiconductor component manufacturing. Its main advantages lie in cost savings, reduced complexity, and the maturity of the process. The selection of the photoresist depends upon many factors, such as cost, the thickness of the resist, and the convenient extraction of useful process parameters. In the present study, an extremely cheap dry-film photoresist, E8015, with a thickness of 38 micrometers, is processed for edge profiling for the first time, to the best of the author's knowledge. A useful parameter range for resist processing is successfully extracted. An undercut angle of 66 to 73 degrees is realized by varying parameters such as the exposure energy and the development time. Finally, a 10-micrometer-thick aluminum-nickel metallic multilayer is lifted off on a plain silicon wafer. Possible applications lie in controlled self-propagating reactions within structured metallic multilayers that may be utilized for IC packaging in the future.
Keywords: lift-off, IC packaging, photoresist, multilayer
15783 Gearbox Defect Detection in the Semi Autogenous Mills Using the Vibration Analysis Technique
Authors: Mostafa Firoozabadi, Alireza Foroughi Nematollahi
Abstract:
Semi-autogenous mills are designed for grinding of primary crushed ore and are the most widely used mills in concentrators globally. Any defect occurring in a semi-autogenous mill can stop the production line. A gearbox is a significant part of a rotating machine or a mill, so gearbox monitoring is a necessary process to prevent unwanted defects. When a defect happens in a gearbox bearing, this defect can be transferred to the other parts of the equipment, such as the inner ring, outer ring, balls, and the bearing cage. Vibration analysis is one of the most effective and common ways to detect bearing defects in mills. The vibration signal in a mill can be generated by different parts of the mill, including the electromotor, the pinion and girth gear, different rolling bearings, and the tire. When the vibration signals generated by these parts are added to the gearbox vibration spectrum, accurate and timely defect detection in the gearbox becomes difficult. In this paper, a new method is proposed to detect gearbox bearing defects in a semi-autogenous mill accurately and in time, using vibration signal analysis. In this method, if the vibration values in the vibration curve increase, the probability of a defect occurring is investigated by comparing the equipment vibration values with the standard ones. Then, all vibration frequencies are extracted from the vibration signal and the equipment defect is detected using the vibration spectrum curve. The method was implemented on the semi-autogenous mills of the Golgohar Mining and Industrial Company in Iran. The results show that the proposed method can detect bearing looseness accurately and in time. After defect detection, the bearing is opened before the equipment fails and predictive maintenance actions are implemented on it.
Keywords: condition monitoring, gearbox defects, predictive maintenance, vibration analysis
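A hedged sketch of the spectrum-based check described: compute the vibration spectrum and compare the amplitude near a characteristic bearing defect frequency (here the ball-pass frequency of the outer race, BPFO) against a baseline-derived threshold. The bearing geometry, shaft speed, and synthetic signal are invented for illustration.

```python
import numpy as np

fs = 2048                      # sampling rate, Hz
t = np.arange(0, 4, 1 / fs)    # 4 s of vibration data

# Characteristic outer-race frequency: BPFO = (Nb/2) * fr * (1 - (d/D) * cos(phi))
Nb, fr, d_over_D, phi = 12, 16.0, 0.25, 0.0          # hypothetical bearing geometry
bpfo = 0.5 * Nb * fr * (1 - d_over_D * np.cos(phi))  # = 72 Hz here

rng = np.random.default_rng(7)
healthy = 0.2 * np.sin(2 * np.pi * fr * t) + 0.05 * rng.normal(size=t.size)
faulty = healthy + 0.15 * np.sin(2 * np.pi * bpfo * t)   # synthetic defect component

def band_amplitude(x, f_target, bw=2.0):
    spec = np.abs(np.fft.rfft(x)) / len(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    band = (freqs > f_target - bw) & (freqs < f_target + bw)
    return spec[band].max()

threshold = 3 * band_amplitude(healthy, bpfo)            # baseline-derived limit
print("healthy:", band_amplitude(healthy, bpfo),
      "faulty:", band_amplitude(faulty, bpfo),
      "defect suspected:", band_amplitude(faulty, bpfo) > threshold)
```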
15782 The Use of Learning Management Systems during the Emergence of Tacit Knowledge
Authors: Ercan Eker, Muhammer Karaman, Akif Aslan, Hakan Tanrikuluoglu
Abstract:
A deficiency of institutional memory and knowledge management can result in information security breaches, loss of prestige and trustworthiness, and, worst of all, the loss of know-how and institutional knowledge. Traditional learning management within organizations is generally handled by personal effort, and that kind of effort mostly depends on personal desire, motivation, and institutional belonging. Even if an organization has highly motivated employees at a certain time, the institutional knowledge and memory life cycle will generally remain limited to the time these employees spend in the organization. Having a learning management system in an organization can sustain the institutional memory, knowledge, and know-how in the organization. Learning management systems are needed even more in public organizations, where job rotation is frequent and managers are appointed periodically. However, a learning management system should not be seen as an organization's website; it is a more comprehensive, interactive, and user-friendly knowledge management tool for organizations. In this study, the importance of using learning management systems in the process of the emergence of tacit knowledge is underlined.
Keywords: knowledge management, learning management systems, tacit knowledge, institutional memory
15781 Benchmarking of Pentesting Tools
Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba
Abstract:
The benchmarking of tools for the dynamic analysis of vulnerabilities in web applications is carried out periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or on non-academic websites, and always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific approach have conducted their analysis with the same methodology as the empirical authors. This paper is motivated by the interest in finding answers to questions that many users of this type of tool have been asking over the years, such as whether the tool truly tests and evaluates every vulnerability that it claims to, or whether the tool really delivers an accurate report of all the vulnerabilities tested and exploited. These kinds of questions have also motivated previous work, but without real answers. The aim of this paper is to show results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common benchmarking model used in all those previous works.
Keywords: cybersecurity, IDS, security, web scanners, web vulnerabilities
15780 Disrupted or Discounted Cash Flow: Impact of Digitisation on Business Valuation
Authors: Matthias Haerri, Tobias Huettche, Clemens Kustner
Abstract:
This article discusses the impact of digitisation on business valuation. In order to become and remain 'digital', investments are necessary whose return on investment (ROI) often remains vague. This uncertainty is problematic for a valuation that relies on predictable cash flows, fixed capital structures, and a steady state. Digitisation does not, however, make a company valuation impossible, but traditional approaches must be reconsidered. The authors identify four areas that are changing: (1) Tools instead of intuition: in the future, company valuation will be neither art nor science, but craft. This requires not intuition but experience and good tools; digital valuation tools beyond Excel will therefore gain in importance. (2) Real time instead of deadline: at present, company valuations are always carried out on a case-by-case basis and for a specific key date. This will change with digitalization and the introduction of web-based valuation tools. Company valuations can thus not only be carried out faster and more efficiently but can also be offered more frequently; instead of calculating the value for a past key date, current and real-time valuations can be carried out. (3) Predictive planning instead of analysis of the past: past data will still be needed in the future, but its use will not be limited to monovalent time series or key-figure analyses. The images of 'black swans' and the 'turkey illusion' have made clear that we build forecasts on too few data points from the past and underestimate the power of chance; predictive planning can help here. (4) Convergence instead of residual value: digital transformation shortens the lifespan of viable business models. If companies want to live forever, they have to change forever. For company valuation, this means that the business model valid on the valuation date has only a limited service life.
Keywords: business valuation, corporate finance, digitisation, disruption
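A small numerical illustration of the fourth point, convergence instead of residual value: instead of a perpetual terminal value, the cash flows of the current business model are assumed to fade out over a limited remaining lifespan. All figures are invented.

```python
# Illustrative only: compare a perpetuity-based DCF with a finite "convergence"
# horizon in which cash flows of the current business model decay to zero.
wacc, cf0, growth = 0.08, 100.0, 0.02        # invented inputs

def dcf_with_terminal_value(years=5):
    value = sum(cf0 * (1 + growth) ** t / (1 + wacc) ** t for t in range(1, years + 1))
    tv = cf0 * (1 + growth) ** (years + 1) / (wacc - growth)   # Gordon growth perpetuity
    return value + tv / (1 + wacc) ** years

def dcf_with_convergence(lifespan=10):
    # cash flow fades linearly to zero as the business model expires
    return sum(cf0 * (1 - t / lifespan) / (1 + wacc) ** t for t in range(1, lifespan + 1))

print("perpetual terminal value:", round(dcf_with_terminal_value(), 1))
print("finite-lifespan value:  ", round(dcf_with_convergence(), 1))
```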
15779 Biodiversity and Distribution of Tettigonioidea, Ensifera of Pakistan
Authors: Riffat Sultana Pathan, Waheed Ali Panhwar, Muhammad Saeed Wagan
Abstract:
Tettigonioidea are phytophagous insects that damage agricultural crops, forests, fruit orchards, berry shrubs, and grasses. The material was collected from different agricultural fields of rice, sugarcane, wheat, and maize surrounded by different grasses. Besides this, forests, hilly areas, semi-desert, and desert regions were also inspected from time to time. All material was captured, killed, and stored using standard entomological methods. As a result of the extensive survey, fair numbers of specimens were captured from the different climatic zones of the country. Seven subfamilies of Tettigonioidea, viz. Pseudophyllinae, Phaneropterinae, Conocephalinae, Tettigoniinae, Hexacentrinae, Mecopodinae, and Decticinae, were represented in the collection. This fauna contributes 29 new records to Pakistan and 5 new species to science. In addition, a brief description of each supra-generic category of Tettigonioidea, along with photographs and synonymy, is documented, and a detailed list of host plants from Pakistan was also compiled. This study provides important data for the Integrated Pest Management (IPM) of Tettigonioidea, biodiversity conservation, and grassland restoration in Pakistan.
Keywords: agriculture, Conocephalinae, pest, Phaneropterinae, Tettigoniidae
15778 When You Change The Business Model ~ You Change The World
Authors: H. E. Amb. Terry Earthwind Nichols
Abstract:
Over the years, Ambassador Nichols observed that successful companies all have one thing in common: belief in people. His observations of people in many companies, industries, and countries have also led to one conclusion: groups of achievers far exceed the expectations and timelines of their superiors. His experience with achieving this has brought forth a model for the 21st century that will not only exceed the expectations of companies but will also set visions for the future of business globally. It is time for a real discussion about the future of work and the business model that will set the example for the world. Methodologies: in-person observations over 40 years, with Ambassador Nichols present during the observations; audio-visual observations (TV, cinema, social media such as YouTube, and various news outlets); and reading the autobiographies of successful leaders over the last 75 years who led their companies from a distinct perspective: your people are your commodity. Major findings: people believe in the leader's vision for the company so much that they remain excited about the future of the company and want to do anything in their power to ethically achieve that vision. People who achieve regularly in groups, divisions, companies, etcetera, live more healthfully, lowering both sick time off and on-the-job accidents; they cannot wait to physically get to work so that they can feed off the high energy present in these companies; and they are fully respected and supported, resulting in near-zero attrition. Simply put, they do not burn out. Conclusion: to the author's best knowledge, 20th-century practices in business are no longer valid, and people are not going to work in those environments any longer. The average worker in the post-COVID world is better educated than 50 years ago and, most importantly, has real-time information about any subject and can stream injustices as they happen. The Consortium Model is the model for the evolution of both humankind and business in the 21st century.
Keywords: business model, future of work, people, paradigm shift, business management
15777 Experimental Approach for Determining Hemi-Anechoic Characteristics of Engineering Acoustical Test Chambers
Authors: Santiago Montoya-Ospina, Raúl E. Jiménez-Mejía, Rosa Elvira Correa Gutiérrez
Abstract:
An experimental methodology is proposed for determining the hemi-anechoic characteristics of an engineering acoustics room built at the facilities of Universidad Nacional de Colombia, in order to evaluate the free-field conditions inside the chamber. Experimental results were compared with theoretical ones for both the source and the sound propagation inside the chamber. The acoustic source was modeled using the monopole radiation pattern of point sources, and the image method was used to deal with the reflective plane of the room, that is, the uninsulated floor. The finite-difference time-domain (FDTD) method was implemented to calculate the sound pressure at every spatial point of the chamber. The comparison between theoretical and experimental data yields a minimal error, giving satisfactory results for the hemi-anechoic characterization of the chamber.
Keywords: acoustic impedance, finite-difference time-domain, hemi-anechoic characterization
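A hedged sketch of the analytical reference described: a monopole point source above a perfectly reflective floor handled with the image method, so the pressure at a receiver is the sum of the direct and mirrored contributions. Source strength, frequency, and geometry are illustrative, not the chamber's actual measurement setup.

```python
import numpy as np

c, f = 343.0, 500.0                  # speed of sound (m/s), tone frequency (Hz)
k = 2 * np.pi * f / c                # wavenumber
src = np.array([0.0, 0.0, 1.5])      # source 1.5 m above the rigid floor (z = 0)
img = src * np.array([1, 1, -1])     # image source mirrored through the floor

def pressure(receiver, A=1.0):
    """Complex pressure: direct monopole + image contribution (rigid floor, R = 1)."""
    r1 = np.linalg.norm(receiver - src)
    r2 = np.linalg.norm(receiver - img)
    return A * (np.exp(1j * k * r1) / r1 + np.exp(1j * k * r2) / r2)

# Deviation from the free-field (direct path only) along a measurement line.
for x in (1.0, 2.0, 4.0, 8.0):
    rec = np.array([x, 0.0, 1.5])
    r_direct = np.linalg.norm(rec - src)
    p_total = abs(pressure(rec))
    p_free = abs(np.exp(1j * k * r_direct) / r_direct)
    print(f"x = {x:4.1f} m  deviation = {20 * np.log10(p_total / p_free):+.2f} dB")
```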
15776 The Positive Effects of Top-Sharing: A Case Study
Authors: Maike Andresen, Georg Dochtmann
Abstract:
Due to political, social, and societal changes in the organization of labor, top-sharing, defined as job-sharing in leading positions, is becoming more important in HRM. German companies are looking for practical and economically meaningful solutions that allow a lasting increase in the proportion of women in management, not only because of a recently implemented quota. Furthermore, supporting employees in achieving work-life balance is perceived as an important goal of sustainable HRM to gain competitive advantage. Top-sharing is seen as suitable for reaching both goals. To evaluate the determinants of effective top-sharing, a case study of a newly implemented top-sharing tandem in a large German enterprise was conducted over a period of 15 months. In this company, a full leadership position was split into two 60% part-time positions held by an experienced female leader in her late career and a female colleague who took over her first leadership position (mid-career). We assumed a person-person fit, in terms of a match of the top-sharing partners' personality profiles (Big Five) and their leadership motivations, to be an important prerequisite for an effective collaboration between them. The person-person fit variables were evaluated once, before the tandem started to work. Both leaders were expected to learn from each other (mentoring, competency development). On an operational level, they were supposed to lead the same employees together in an effective manner (leader-member exchange), presupposing effective cooperation between them (handing over information). To see developments over time, these processes were evaluated three times over the span of the project. Top-sharing and the underlying processes are expected to positively influence the tandem's performance, which was evaluated twice, at the beginning and at the end of the project, to assess its development over time as well. The evaluation of the personalities and basic motives suggests that both executives can form a successful top-sharing tandem. The competency evaluations (supervisor as well as self-assessment) increased over the time span. Although the top-sharing tandem worked on equal terms, they implemented classical rather than peer mentoring due to the different career ambitions of the tandem partners; thus, the opportunities were not used completely. Team-member exchange scores confirmed the good cooperation between the top-sharers. Although the employees did not evaluate the leader-member exchange between them and the two leaders of the tandem homogeneously, the top-sharing tandem itself did not have the impression that the employees' task performance depended on which of the tandem partners was responsible for the task. Furthermore, top-sharing did not negatively influence the performance of either leader. During qualitative interviews with the top-sharers and their team, we found that the top-sharers could focus more easily on their tasks. The results suggest positive outcomes of top-sharing (e.g., competency improvement, learning from each other through mentoring). Top-sharing does not hamper performance; thus, further research and practical implementations are suggested. As part-time jobs are still more often a female solution for increasing work-life and work-family balance, top-sharing may be a suitable solution to increase the proportion of women in leadership positions as well as to sustainably increase the work-life balance of executives.
Keywords: mentoring, part-time leadership, top-sharing, work-life-balance
15775 Investigating Acute and Chronic Pain after Bariatric Surgery
Authors: Patti Kastanias, Wei Wang, Karyn Mackenzie, Sandra Robinson, Susan Wnuk
Abstract:
Obesity is a worldwide epidemic and is recognized as a chronic disease. Pain in the obese individual is a multidimensional issue. An increase in BMI is positively correlated with pain incidence and severity, especially in central obesity, where individuals are twice as likely to have chronic pain. Both obesity and chronic pain are also associated with mood disorders, and pain is worse among obese individuals with depression and anxiety. Bariatric surgery provides patients with an effective solution for long-term weight loss and associated health problems. However, not much is known about acute and chronic pain after bariatric surgery and its contributing factors, including mood disorders. Nurse practitioners (NPs) at one large multidisciplinary bariatric surgery centre led two studies to examine acute and chronic pain and pain management over time after bariatric surgery. The purpose of the initial study was to examine the incidence and severity of acute and chronic pain after bariatric surgery. The aim of the secondary study was to further examine chronic pain, specifically looking at psychological factors that influence the severity or incidence of both neuropathic and somatic pain, as well as changes in opioid use. The initial study was a prospective, longitudinal study in which patients having bariatric surgery at one surgical centre were followed for up to 6 months postoperatively. Data were collected at 7 time points using validated instruments for pain severity, pain interference, and patient satisfaction. In the second study, subjects were followed longitudinally, starting preoperatively and then at 6 months and 1 year postoperatively, to capture changes in chronic pain and influencing variables over time. Valid and reliable instruments were utilized for all major study outcomes. In the first study, there was a trend towards decreased acute postoperative pain over time. The incidence and severity of chronic pain were found to be significantly reduced at 6 months post bariatric surgery. Interestingly, the interference of chronic pain in daily life, such as normal work, mood, and walking ability, was significantly improved at 6 months postoperatively; however, this was not the case with sleep. Preliminary results of the secondary study indicate that pain severity, pain interference, anxiety, and depression are significantly improved at 6 months postoperatively. In addition, preoperative anxiety, depression, and emotional regulation were predictive of pain interference, but not pain severity. The results of our regression analyses provide evidence for the impact of pre-existing psychological factors on pain, particularly anxiety, in obese populations.
Keywords: bariatric surgery, mood disorders, obesity, pain