Search results for: agitation time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18244

15694 A Framework for Systemically Understanding and Increasing Compliance with Water Regulation in Time Limited and Uncertain Contexts

Authors: Luisa Perez-Mujica

Abstract:

Traditionally, non-compliance with water regulation has been attributed to a lack of information or knowledge of the regulations. In other words, behavioural change and education are conflated with the mere communication of regulations. However, compliance is a complex response to water regulation factors, including 1) knowledge and understanding of regulations; 2) the perception that resources are overregulated; 3) the presence of regulatory officers in the field; 4) accurate communication of what is being protected; 5) the time lag between behavioural change projects and the observation of outcomes; and 6) how the success of behavioural change is measured and evaluated. This paper presents a framework for designing education and behavioural change projects by understanding non-compliance in terms of the interaction of these factors, including a process for prioritizing projects and actions and for evaluating and monitoring outcomes. By taking a systemic approach to compliance, more targeted actions can be efficiently identified and prioritized, avoiding the reactive nature of education and behavioural change projects.

Keywords: water regulation, compliance, behaviour change, systems thinking

Procedia PDF Downloads 240
15693 Pathway to Sustainable Shipping: Electric Ships

Authors: Wei Wang, Yannick Liu, Lu Zhen, H. Wang

Abstract:

Maritime transport plays an important role in global economic development but inevitably faces increasing pressures from all sides, such as ship operating cost reduction and environmental protection. An ideal innovation to address these pressures is the electric ship. Electric ships are still at an early stage, and given their special characteristics, i.e., a limited travel range, the service network needs to be re-designed carefully to guarantee their efficient operation. This research designs a cost-efficient and environmentally friendly service network for electric ships, covering the location of charging stations, the charging plan, route planning, ship scheduling, and ship deployment. The problem is formulated as a mixed-integer linear programming model with the objective of minimizing total cost, comprising the charging cost, the construction cost of charging stations, and the fixed cost of ships. A case study using data from the shipping network along the Yangtze River is conducted to evaluate the performance of the model. Two operating scenarios are used: an electric ship scenario, in which all transportation tasks are fulfilled by electric ships, and a conventional ship scenario, in which all transportation tasks are fulfilled by fuel oil ships. The results show that the total cost of using electric ships is only 42.8% of that of using conventional ships. Using electric ships can reduce SOx by 80%, NOx by 93.47%, PM by 89.47%, and CO2 by 42.62%, but consumes 2.78% more time to fulfill all transportation tasks. Extensive sensitivity analyses are also conducted for key operating factors, including battery capacity, charging speed, volume capacity, and the service time limit of transportation tasks. The implications of the results are as follows: 1) it is necessary to equip ships with a large-capacity battery when the number of charging stations is low; 2) battery capacity influences the number of ships deployed on each route; 3) increasing battery capacity makes the electric ship more cost-effective; 4) charging speed does not affect the charging amount or the location of charging stations, but influences the schedule of ships on each route; 5) there exists an optimal volume capacity at which all costs and total delivery time are lowest; 6) the service time limit influences the ship schedule and ship cost.
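As an illustration of the type of model described, the sketch below formulates a toy version of the charging-station location and ship deployment decisions as a mixed-integer linear program with the PuLP library. The ports, routes and costs are hypothetical, and the charging plan, scheduling and emission terms of the full model are omitted.

# Minimal MILP sketch (hypothetical data): choose charging stations and the
# number of electric ships per route so that every route is served and every
# route has at least one usable charging station, at minimum total cost.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, LpInteger

ports = ["P1", "P2", "P3"]
routes = ["R1", "R2"]
station_cost = {"P1": 120.0, "P2": 150.0, "P3": 100.0}   # construction cost
ship_cost = 80.0                                         # fixed cost per ship
demand_ships = {"R1": 2, "R2": 3}                        # ships needed per route
covers = {("P1", "R1"): 1, ("P2", "R1"): 1,              # 1 if a station at port p
          ("P2", "R2"): 1, ("P3", "R2"): 1}              # can serve route r

build = LpVariable.dicts("build", ports, cat=LpBinary)
ships = LpVariable.dicts("ships", routes, lowBound=0, cat=LpInteger)

prob = LpProblem("electric_ship_network", LpMinimize)
prob += lpSum(station_cost[p] * build[p] for p in ports) + \
        lpSum(ship_cost * ships[r] for r in routes)

for r in routes:
    prob += ships[r] >= demand_ships[r]                                   # serve demand
    prob += lpSum(covers.get((p, r), 0) * build[p] for p in ports) >= 1   # coverage

prob.solve()
print({p: build[p].value() for p in ports}, {r: ships[r].value() for r in routes})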

Keywords: cost reduction, electric ship, environmental protection, sustainable shipping

Procedia PDF Downloads 78
15692 Sustainable Traditional Architecture and Urban Planning in Hot-Arid Climate of Iran

Authors: Farnaz Nazem

Abstract:

The aim of sustainable architecture is to design buildings with the least adverse effects on the environment and to provide better conditions for people. What building forms make the best use of land? This question was addressed in the late 1960s at the Centre for Land Use and Built Form Studies in Cambridge and led to a number of papers that had a great influence on the practice of urban design. This paper concentrates on the results of sustainability arising from climatic conditions in Iranian traditional architecture in hot-arid regions. As people spend a significant amount of their time in houses, it is very important that houses fulfill their physical and spiritual needs as well as satisfying the cultural and religious aspects of their lifestyles. In a vast country such as Iran, with different climatic zones, traditional builders have presented a series of logical solutions for human comfort. These solutions have been able to respond to environmental problems over a long period of time. As a result, by considering the experience embodied in the traditional architecture of the hot-arid climate of Iran, it is possible to attain sustainable architecture.

Keywords: hot-arid climate, Iran, sustainable traditional architecture, urban planning

Procedia PDF Downloads 472
15691 The Relationship among Lifestyles, Accompany Forms, and Children’s Capability to Solve Problems of Modern Families

Authors: Tien-Ling Yeh, Jo-Han Chan

Abstract:

The percentage of dual-earner couples has become higher each year, and family lifestyles in Taiwan have been changing. This fact reflects the importance of family communication and the parent-child relationship. This study aimed to explore the influences of family lifestyles and forms of parental accompaniment on children’s capability to solve problems. The research process included two phases: (1) a literature review, to explore the characteristics of children’s capability to solve problems and methods to measure this capability; and (2) questionnaire analyses, to explore the influences of the lifestyles and the accompaniment time and forms of modern families on their children’s capability to solve problems. The questionnaires were issued in October and November 2016. A total of 300 questionnaires were retrieved, among which 250 were valid. The findings are summarized below. The linguistic performances of children from families with a busy and haggling lifestyle or an intermittent childcare lifestyle were rather good; besides being interested in learning, these children could solve problems or difficulties independently. The capability to analyze problems of children from families with accompaniment time during 19:00-19:30 (family dinner time) or 22:00-23:30 (before bedtime) was good. When facing a complex problem, these children could identify its most important factor; when seeing a problem, they would first look for the cause; and if they encountered a bottleneck while solving a problem, they would review the context of the problem and related conditions to come up with another solution. According to the literature, learning toys with numbers and symbols and learning to read can help develop children’s logical thinking, which is helpful for solving problems. Interestingly, some studies suggested that children playing with fluid constructive toys are less likely to give up what they are doing and more likely to identify problems in their daily life; some of them can even come up with creative and effective solutions.

Keywords: accompany, lifestyle, parent-child, problem-solving

Procedia PDF Downloads 117
15690 Behavioral Finance: Anomalies at Real Markets, Weekday Effect

Authors: Vera Jancurova

Abstract:

Financial theory is dominated by the belief that the weekday effect has disappeared from current markets. The purpose of this article is to study anomalies, especially the weekday effect, in real markets that disrupt the efficiency of financial markets. The research is based on analyses of the historical daily rates of significant world indices to determine the presence of weekday effects on financial markets. The methodology is based on analyses of the daily averages of particular indices over different time periods. Average daily gains were analyzed over the whole time interval and then over particular five- and ten-year periods, with the aim of detecting their presence in current financial markets. The results confirm the presence of the weekday effect in the most significant indices, for example Nasdaq, S&P 500, FTSE 100 and the Hang Seng. It was confirmed that in the last ten years the weekend effect disappeared from financial markets; however, in the last few years the indicators show that the weekday effect is coming back. The study shows that the weekday effect has to be taken into consideration on financial markets, especially in recent years.
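As an illustration of the kind of day-of-week analysis described above, the sketch below computes average daily returns per weekday over the full sample and over a recent five-year window; the simulated price series is a placeholder for the index data actually used in the study.

# Sketch: average return per weekday over the full sample and a recent sub-period.
import pandas as pd
import numpy as np

# 'prices' stands in for a Series of daily index values indexed by date.
rng = pd.date_range("2000-01-03", periods=5000, freq="B")
prices = pd.Series(100 * np.exp(np.cumsum(np.random.normal(0, 0.01, len(rng)))), index=rng)

returns = prices.pct_change().dropna()
by_weekday = returns.groupby(returns.index.day_name()).mean()

recent = returns.loc[returns.index >= returns.index.max() - pd.DateOffset(years=5)]
recent_by_weekday = recent.groupby(recent.index.day_name()).mean()

print(pd.DataFrame({"full_sample": by_weekday, "last_5_years": recent_by_weekday}))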

Keywords: indices, anomalies, behavioral finance, weekday effect

Procedia PDF Downloads 339
15689 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach

Authors: D. Tedesco, G. Feletti, P. Trucco

Abstract:

The present study aims to develop a Decision Support System (DSS) to support the operational decisions of Emergency Medical Services (EMS) regarding the assignment of medical emergency requests to Emergency Departments (ED). In the literature, this problem is also known as “hospital selection” and concerns the definition of policies for selecting the ED to which patients who require further treatment are transported by ambulance. The research methodology starts with a first phase reviewing the technical-scientific literature on DSSs supporting EMS management and, in particular, the hospital selection decision. From the literature analysis, it emerged that current studies are mainly focused on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a request. Therefore, all ED-related issues are excluded and considered part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes transport time and releases the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that considers information relating to the subsequent phases of the process, such as the case-mix, the expected service throughput times, and the operational capacity of the different EDs in hospitals. To this end, a Discrete Event Simulation (DES) model was created to evaluate different hospital selection policies. The next steps of the research consisted of developing a general simulation architecture, implementing it in the AnyLogic software, and validating it on a realistic dataset. The hospital selection policy that produced the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the ED until the beginning of the clinical evaluation by the doctor. Finally, two approaches were compared: a static approach, based on a retrospective estimate of the TTP, and a dynamic approach, based on a predictive estimate of the TTP determined with a constantly updated Winters model. Findings reveal that adopting the minimization of TTP as a hospital selection policy brings several benefits. It significantly reduces service throughput times in the ED with a minimal increase in travel time. Furthermore, it provides an immediate view of the saturation state of the EDs and takes into account the case-mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. Besides, the predictive approach is certainly more reliable than the retrospective approach in terms of TTP estimation, but it is more difficult to apply. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.
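The dynamic (predictive) TTP estimate can be sketched as follows. The example below uses statsmodels' Holt-Winters exponential smoothing as a stand-in for the Winters model and entirely hypothetical travel times and waiting-time histories; it illustrates the selection rule only, not the AnyLogic simulation itself.

# Sketch: hospital selection by minimum predicted Time To Provider (TTP).
# TTP = travel time to the ED + predicted waiting time before clinical evaluation.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

hourly = pd.date_range("2024-01-01", periods=24 * 30, freq="h")
waits = {  # hypothetical historical waiting times (minutes) per ED
    "ED_A": pd.Series(30 + 10 * np.sin(np.arange(len(hourly)) * 2 * np.pi / 24)
                      + np.random.normal(0, 3, len(hourly)), index=hourly),
    "ED_B": pd.Series(45 + 5 * np.sin(np.arange(len(hourly)) * 2 * np.pi / 24)
                      + np.random.normal(0, 3, len(hourly)), index=hourly),
}
travel = {"ED_A": 18.0, "ED_B": 9.0}  # minutes from the incident location (hypothetical)

def predicted_ttp(ed):
    # dynamic estimate: Winters-type (triple exponential smoothing) forecast of the wait
    model = ExponentialSmoothing(waits[ed], trend="add",
                                 seasonal="add", seasonal_periods=24).fit()
    return travel[ed] + float(model.forecast(1).iloc[0])

def static_ttp(ed):
    # retrospective estimate: mean wait over the last 24 hours
    return travel[ed] + float(waits[ed].tail(24).mean())

best_dynamic = min(travel, key=predicted_ttp)
best_static = min(travel, key=static_ttp)
print("dynamic choice:", best_dynamic, "| static choice:", best_static)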

Keywords: discrete event simulation, emergency medical services, forecast model, hospital selection

Procedia PDF Downloads 90
15688 Effect of Treated Peat Soil on the Plasticity Index and Hardening Time

Authors: Siti Nur Aida Mario, Farah Hafifee Ahmad, Rudy Tawie

Abstract:

Soil stabilization is widely implemented in the construction industry nowadays, and peat soil is well known as one of the most problematic soils among engineers. Stabilization procedures need to take into account both the physical and the engineering properties of the stabilized peat soil. This paper presents results on the plasticity index and hardening of peat soil treated with various dosages of additives. To determine the plasticity of the treated peat soil, Atterberg limit tests, comprising plastic limit and liquid limit tests, were conducted; the liquid limit in this experimental study was determined using a cone penetrometer. A Vicat testing apparatus was used in the hardening test, in which the penetration of the plunger was recorded every hour for 24 hours. The results show that the peat soil stabilized with 80% FAAC and 20% OPC has the lowest plasticity index and recorded the fastest initial setting time. The significance of this study is to promote a greener solution for the future soil stabilization industry.

Keywords: additives, hardening, peat soil, plasticity index, soil stabilization

Procedia PDF Downloads 329
15687 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications

Authors: Atish Bagchi, Siva Chandrasekaran

Abstract:

Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification and prediction of machine behaviour are needed to minimise financial losses. Although a vast literature exists on time-series data processing using machine learning, the challenges faced by the industries that lead to unplanned downtimes are: the current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but occurring at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real-time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and the consequential financial losses. Method: The data stored within a process control system, e.g., Industrial IoT or a Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian, to optimise storage and query performance. This sampling may inadvertently discard values that contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of a GNN, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal contexts, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (by 20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's process control system in real-time to perform forecasting and classification tasks, helping asset management engineers to operate their machines more efficiently and reduce unplanned downtimes. A series of trials of this model is planned in other manufacturing industries in the future.
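The full hybrid GNN is not reproduced here; the sketch below only illustrates how entropy and spectral descriptors could be computed from a window of sampled sensor data before being attached as additional node features. The sensor signal, window length and feature names are hypothetical.

# Sketch: entropy and spectral descriptors of a sampled sensor window.
# These per-window values could be attached as extra node features before
# being fed to a graph neural network; this is an illustration, not the
# authors' model.
import numpy as np
from scipy.stats import entropy

def window_features(x, fs):
    """x: 1-D array of sensor samples, fs: sampling frequency in Hz."""
    # Shannon entropy of the amplitude histogram
    hist, _ = np.histogram(x, bins=32, density=True)
    h = entropy(hist[hist > 0])

    # Spectral descriptors from the magnitude spectrum
    spec = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = spec ** 2
    centroid = float((freqs * power).sum() / power.sum())
    dominant = float(freqs[np.argmax(power)])
    return {"entropy": float(h), "spectral_centroid": centroid,
            "dominant_freq": dominant}

# 3600 one-per-minute samples (60 hours) of a hypothetical flow sensor
flow = np.sin(2 * np.pi * 0.01 * np.arange(3600)) + np.random.normal(0, 0.1, 3600)
print(window_features(flow, fs=1 / 60))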

Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning

Procedia PDF Downloads 150
15686 Optimization, Yield and Chemical Composition of Essential Oil from Cymbopogon citratus: Comparative Study with Microwave Assisted Extraction and Hydrodistillation

Authors: Irsha Dhotre

Abstract:

Cymbopogon citratus, generally known as Indian lemongrass, is widely used in the cosmetic, pharmaceutical, dairy pudding, and food industries. To enhance the quality of extraction, a microwave-oven-assisted hydrodistillation process was implemented. The basic parameters which influence the rate of extraction were considered: the extraction temperature, the extraction time, and the applied microwave-oven power. Locally available CKP 25 Cymbopogon citratus was used for the extraction of essential oil. The extraction parameters were optimized with a Box-Behnken design (BBD) evaluated using Design-Expert 13 software. The regression model revealed that the optimum extraction parameters are a temperature of 35 °C, an extraction time of 130 minutes, and a microwave-oven power of 700 W, giving an extraction yield of 4.76%. Gas Chromatography-Mass Spectrometry (GC-MS) analysis confirmed the significant components present in the extracted lemongrass oil.
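A minimal sketch of a three-factor Box-Behnken design with a quadratic response-surface fit is shown below. It assumes the pyDOE2 and scikit-learn packages are available and uses placeholder yield values; it mimics the type of analysis performed in Design-Expert rather than reproducing the study's data.

# Sketch: three-factor Box-Behnken design and a quadratic response-surface fit.
# Factors (coded -1..+1): temperature, extraction time, microwave power.
# The yield values below are hypothetical placeholders, not the study's data.
import numpy as np
from pyDOE2 import bbdesign                        # assumes the pyDOE2 package
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

design = bbdesign(3, center=3)                     # coded design matrix (15 runs)
yields = np.random.uniform(2.0, 5.0, len(design))  # replace with measured yields

poly = PolynomialFeatures(degree=2, include_bias=False)
X = poly.fit_transform(design)
model = LinearRegression().fit(X, yields)

# Search the fitted surface on a coded grid for the maximum predicted yield
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
pred = model.predict(poly.transform(grid))
best = grid[np.argmax(pred)]
print("coded optimum (temp, time, power):", best, "predicted yield:", pred.max())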

Keywords: Box–Behnken design, Cymbopogon citratus, hydro distillation, microwave-oven, response surface methodology

Procedia PDF Downloads 94
15685 A Pole Radius Varying Notch Filter with Transient Suppression for Electrocardiogram

Authors: Ramesh Rajagopalan, Adam Dahlstrom

Abstract:

Noise removal techniques play a vital role in the performance of electrocardiographic (ECG) signal processing systems. ECG signals can be corrupted by various kinds of noise, such as baseline wander, electromyographic interference, and power-line interference. One of the significant challenges in ECG signal processing is the degradation caused by additive 50 or 60 Hz power-line interference. This work investigates the removal of power-line interference and the suppression of the transient response when filtering noise-corrupted ECG signals. We demonstrate the effectiveness of an Infinite Impulse Response (IIR) notch filter with a time-varying pole radius for improving the transient behavior: a temporary change in the pole radius of the filter diminishes the transient. Simulation results show that the proposed IIR filter with a time-varying pole radius outperforms traditional IIR notch filters in terms of mean square error and transient suppression.
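A minimal sketch of the idea is given below: a second-order IIR notch whose pole radius is deliberately varied over time so that the filter settles quickly before reaching its final, narrow notch. The sampling rate, ramp constant and test signal are illustrative choices, not the parameters used in the paper.

# Sketch: 50 Hz IIR notch filter whose pole radius ramps up over time to
# suppress the start-up transient.  Parameter values are illustrative.
import numpy as np

fs = 500.0                      # sampling rate (Hz)
f0 = 50.0                       # power-line frequency to remove
w0 = 2 * np.pi * f0 / fs
r_final, r_start, tau = 0.98, 0.60, 100.0   # final/initial pole radius, ramp constant

t = np.arange(0, 5, 1 / fs)
ecg_like = np.sin(2 * np.pi * 1.2 * t)                   # crude ECG stand-in
x = ecg_like + 0.3 * np.sin(2 * np.pi * f0 * t)          # add power-line interference

y = np.zeros_like(x)
for n in range(len(x)):
    r = r_final - (r_final - r_start) * np.exp(-n / tau)  # time-varying pole radius
    x1 = x[n - 1] if n >= 1 else 0.0
    x2 = x[n - 2] if n >= 2 else 0.0
    y1 = y[n - 1] if n >= 1 else 0.0
    y2 = y[n - 2] if n >= 2 else 0.0
    # zeros fixed on the unit circle at +/- w0, poles at radius r
    y[n] = (x[n] - 2 * np.cos(w0) * x1 + x2
            + 2 * r * np.cos(w0) * y1 - (r ** 2) * y2)

def power_at(sig, f):
    spec = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    return spec[np.argmin(np.abs(freqs - f))]

print("50 Hz component before:", power_at(x, f0), "after:", power_at(y, f0))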

Keywords: notch filter, ECG, transient, pole radius

Procedia PDF Downloads 377
15684 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods

Authors: A. Senthil Kumar, V. Murali Bhaskaran

Abstract:

In the information technology field, people use various tools and software for official and personal purposes. Nowadays, people worry about choosing data access and extraction tools when buying and selling their products, and they also worry about various quality factors such as price, durability, color, size, and availability of the product. The main purpose of this research study is to find solutions to these unsolved existing problems. The proposed Multidirectional Rank Prediction (MDRP) decision-making algorithm is designed to support effective strategic decisions at all levels of data extraction; it uses a real-time textile dataset and analyzes the results. Finally, the results are obtained and compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
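The MDRP algorithm itself is not publicly specified, so the sketch below only illustrates the Pearson's Correlation Coefficient (PCC) baseline against which it is compared: a user-based collaborative filtering prediction on a hypothetical product-rating matrix.

# Sketch: user-based collaborative filtering with Pearson's Correlation
# Coefficient (PCC), one of the baselines MDRP is compared against.
# The toy product-rating matrix below is hypothetical.
import numpy as np

R = np.array([  # rows: users, cols: textile products, 0 = not rated
    [5, 3, 0, 4],
    [4, 0, 4, 5],
    [1, 1, 5, 0],
    [0, 3, 4, 4],
], dtype=float)

def pcc(u, v):
    mask = (u > 0) & (v > 0)                  # co-rated items only
    a, b = u[mask], v[mask]
    if mask.sum() < 2 or a.std() == 0 or b.std() == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])

def predict(user, item):
    mean_u = R[user][R[user] > 0].mean()
    num = den = 0.0
    for other in range(R.shape[0]):
        if other == user or R[other, item] == 0:
            continue
        sim = pcc(R[user], R[other])
        mean_o = R[other][R[other] > 0].mean()
        num += sim * (R[other, item] - mean_o)
        den += abs(sim)
    return mean_u if den == 0 else mean_u + num / den

print("predicted rating of user 0 for product 2:", round(predict(0, 2), 2))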

Keywords: Knowledge Discovery in Database (KDD), Multidirectional Rank Prediction (MDRP), Pearson’s Correlation Coefficient (PCC), VSS (Vector Space Similarity)

Procedia PDF Downloads 286
15683 Effect of Carbon Nanotubes Functionalization with Nitrogen Groups on Pollutant Emissions in an Internal Combustion Engine

Authors: David Gamboa, Bernardo Herrera, Karen Cacua

Abstract:

Nanomaterials have been explored as alternatives for reducing particulate matter from diesel engines, which is one of the most common air pollutants in urban centers. However, the use of nanomaterials as diesel additives has to overcome the instability of the dispersions to be considered viable for commercial use. In this work, carbon nanotubes were functionalized with amide groups to improve the stability of these nanomaterials in a mix of 90% petroleum diesel and 10% palm oil biodiesel (B10) at concentrations of 50 and 100 ppm. The resulting nanofuel was used as the fuel for a stationary internal combustion engine, in which the particulate matter, NOx, and CO emissions were measured. The results showed that the amide groups significantly extend the time the carbon nanotubes remain suspended in the fuel, and at the same time, these nanomaterials helped to reduce the particulate matter and NOx emissions. However, the CO emissions with the nanofuel were higher than those from the combustion of B10. These results suggest that carbon nanotubes have thermal and catalytic effects on the combustion of B10.

Keywords: carbon nanotubes, diesel, internal combustion engine, particulate matter

Procedia PDF Downloads 128
15682 Physical Modeling of Woodwind Ancient Greek Musical Instruments: The Case of Plagiaulos

Authors: Dimitra Marini, Konstantinos Bakogiannis, Spyros Polychronopoulos, Georgios Kouroupetroglou

Abstract:

Archaeomusicology cannot depend entirely on the study of excavated ancient musical instruments, as their condition is often not ideal (i.e., missing or eroded parts) and because of the concern about damaging the originals during experiments. To overcome these obstacles, researchers build replicas. This technique is still the most popular one, although it is rather expensive and time-consuming. Throughout the last decades, the development of physical modeling techniques has provided tools that enable the study of musical instruments through digitally simulated models. This is not only a more cost- and time-efficient technique but also provides additional flexibility, as the user can easily modify parameters such as geometrical features and materials. This paper thoroughly describes the steps to create a physical model of an ancient Greek woodwind instrument, the Plagiaulos. This instrument can be considered an ancestor of the modern flute due to their common geometry and air-jet excitation mechanism. The Plagiaulos comprises a single resonator with an open end and a number of tone holes; the combination of closed and open tone holes produces the pitch variations. In this work, the effects of all the instrument’s components are described by means of physics and then simulated based on digital waveguides. The synthesized sound of the proposed model complies with the theory, highlighting its validity. Further, the synthesized sound of the model simulating the Plagiaulos of Koile (2nd century BCE) was compared with a replica built in our laboratory following the scientific methodologies of archaeomusicology. The aforementioned results verify that robust dynamic digital tools can be introduced in the field of computational and experimental archaeomusicology.
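The complete Plagiaulos model is not reproduced here. The sketch below only illustrates the basic digital-waveguide ingredient such models build on: a delay line whose effective length sets the sounding pitch, closed by a simple low-pass reflection filter and excited by a noise burst (opening a tone hole is roughly equivalent to shortening the effective length). All parameter values are illustrative.

# Sketch: a delay-line waveguide resonator with a two-point-average (low-pass)
# reflection filter, excited by a noise burst.  This is a crude illustration
# of the digital-waveguide principle, not the authors' Plagiaulos model.
import numpy as np
from scipy.io import wavfile

fs = 44100
f0 = 440.0                          # target pitch with all holes closed
N = int(round(fs / f0))             # delay-line length in samples
loop_gain = 0.995                   # controls how slowly the tone decays

delay = np.random.uniform(-1, 1, N)     # air-jet noise burst as initial state
out = np.zeros(2 * fs)                  # two seconds of audio
idx = 0
for n in range(len(out)):
    nxt = (idx + 1) % N
    # two-point average acts as the low-pass reflection filter at the open end
    new_sample = loop_gain * 0.5 * (delay[idx] + delay[nxt])
    out[n] = delay[idx]
    delay[idx] = new_sample
    idx = nxt

# Write the result to a WAV file for listening
wavfile.write("waveguide_sketch.wav", fs, (out / np.abs(out).max()).astype(np.float32))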

Keywords: archaeomusicology, digital waveguides, musical acoustics, physical modeling

Procedia PDF Downloads 113
15681 Forecasting Stock Prices Based on the Residual Income Valuation Model: Evidence from a Time-Series Approach

Authors: Chen-Yin Kuo, Yung-Hsin Lee

Abstract:

Previous studies applying the residual income valuation (RIV) model generally use panel data and single-equation models to forecast stock prices. Unlike these, this paper uses Taiwanese longitudinal data to estimate multi-equation time-series models such as the Vector Autoregressive (VAR) model and the Vector Error Correction Model (VECM), and conducts out-of-sample forecasting. Further, this work assesses their forecasting performance with two instruments. In line with extant research, the major finding is that the VECM outperforms the other three models in forecasting for three stock sectors over all horizons. This implies that an error correction term containing long-run information contributes to improved forecasting accuracy. Moreover, the composite pattern shows that at longer horizons, the VECM produces the greater reduction in errors and performs substantially better than the VAR.
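A minimal sketch of the out-of-sample comparison is given below, using the VECM and VAR implementations in statsmodels on simulated data; in the paper the series would be stock prices and RIV fundamentals, which are not reproduced here.

# Sketch: out-of-sample forecasting with VECM vs. VAR on simulated cointegrated data.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM
from statsmodels.tsa.api import VAR

np.random.seed(0)
n = 200
common = np.cumsum(np.random.normal(size=n))            # shared stochastic trend
data = pd.DataFrame({
    "price": common + np.random.normal(scale=0.5, size=n),
    "fundamental": 0.8 * common + np.random.normal(scale=0.5, size=n),
})

h = 10
train, test = data.iloc[:-h], data.iloc[-h:]

vecm_res = VECM(train, k_ar_diff=2, coint_rank=1, deterministic="ci").fit()
vecm_fc = vecm_res.predict(steps=h)

var_res = VAR(train).fit(maxlags=3)
var_fc = var_res.forecast(train.values[-var_res.k_ar:], steps=h)

rmse = lambda fc: float(np.sqrt(((fc - test.values) ** 2).mean()))
print("VECM RMSE:", rmse(vecm_fc), "| VAR RMSE:", rmse(var_fc))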

Keywords: residual income valuation model, vector error correction model, out of sample forecasting, forecasting accuracy

Procedia PDF Downloads 316
15680 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach

Authors: Elias K. Maragos, Petros E. Maravelakis

Abstract:

In Dynamic Data Envelopment Analysis (DDEA), a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this case, as accepted by most researchers, there are outputs produced by a DMU in one period that are used as inputs in a future period; those outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of the input, output or intermediate data, assuming that the distribution of their virtual values does not deviate from linearity. This weakness limits the accuracy and the analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
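The extended piecewise-linear model is not reproduced here; the sketch below shows the standard input-oriented CCR envelopment linear program that DDEA models build on, solved with scipy on hypothetical input/output data.

# Sketch: standard input-oriented CCR DEA efficiency for one DMU, solved as a
# linear program.  Dynamic and piecewise-linear extensions build on this
# envelopment form; the data below are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0], [3.0, 2.0], [6.0, 5.0]])   # inputs, rows = DMUs
Y = np.array([[1.0], [1.0], [2.0]])                  # outputs, rows = DMUs

def ccr_efficiency(o):
    n_dmu, n_in = X.shape
    n_out = Y.shape[1]
    # decision variables: [theta, lambda_1 .. lambda_n]
    c = np.r_[1.0, np.zeros(n_dmu)]                  # minimise theta
    A_ub, b_ub = [], []
    for i in range(n_in):                            # sum_j lam_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(n_out):                           # sum_j lam_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    bounds = [(None, None)] + [(0, None)] * n_dmu
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")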

Keywords: Dynamic Data Envelopment Analysis, DDEA, piecewise linear inputs, piecewise linear outputs

Procedia PDF Downloads 161
15679 Multiobjective Optimization of Wastewater Treatment by Electrochemical Process

Authors: Malek Bendjaballah, Hacina Saidi, Sarra Hamidoud

Abstract:

The aim of this study is to model and optimize the performance and the energy consumption of a new electrocoagulation (EC) process for the treatment of wastewater, in order to extrapolate it to the industrial scale. Through the judicious application of an experimental design (DOE), it has been possible to evaluate the individual effects and interactions that have a significant influence on both objective functions (maximizing removal efficiency and minimizing energy consumption), using aluminum electrodes as the sacrificial anode. Preliminary experiments showed that the pH of the medium, the applied potential and the EC treatment time are the main parameters. A 3³ factorial design was adopted to model the removal efficiency and the energy consumption. Under optimal conditions, the pollution reduction efficiency is 93%, combined with a minimum energy consumption of 2.60 × 10⁻³ kWh/mg-COD. The applied potential (or current) and the treatment time, together with their interaction, were the most influential parameters in the mathematical models obtained. The modeling results were also correlated with the experimental ones. The results offer promising opportunities to develop a clean and inexpensive technology to eliminate or reduce wastewater pollution.
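The 3³ factorial modeling and the trade-off between the two responses can be prototyped as follows. The sketch uses hypothetical response values and a simple equal-weight desirability score, which is only an illustration and not the authors' statistical treatment.

# Sketch: coded 3^3 factorial design (pH, applied potential, treatment time)
# with quadratic models fitted to the two responses, and a simple weighted
# desirability trade-off (maximise removal efficiency, minimise energy).
# Response values are hypothetical placeholders.
import itertools
import numpy as np

levels = (-1, 0, 1)
design = np.array(list(itertools.product(levels, repeat=3)))   # 27 runs

def quad_terms(d):
    x1, x2, x3 = d.T
    return np.column_stack([np.ones(len(d)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3, x1**2, x2**2, x3**2])

efficiency = np.random.uniform(60, 95, len(design))   # measured COD removal (%)
energy = np.random.uniform(1e-3, 5e-3, len(design))   # kWh / mg-COD

A = quad_terms(design)
beta_eff, *_ = np.linalg.lstsq(A, efficiency, rcond=None)
beta_en, *_ = np.linalg.lstsq(A, energy, rcond=None)

grid = np.array(list(itertools.product(np.linspace(-1, 1, 11), repeat=3)))
G = quad_terms(grid)
eff_hat, en_hat = G @ beta_eff, G @ beta_en
d_eff = (eff_hat - eff_hat.min()) / np.ptp(eff_hat)             # to maximise
d_en = (en_hat.max() - en_hat) / np.ptp(en_hat)                 # to minimise
best = grid[np.argmax(0.5 * d_eff + 0.5 * d_en)]                # equal weights
print("coded optimum (pH, potential, time):", best)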

Keywords: electrocoagulation, green process, experimental design, optimization

Procedia PDF Downloads 97
15678 Spatial Data Mining by Decision Trees

Authors: Sihem Oujdi, Hafida Belbachir

Abstract:

Existing data mining methods cannot be applied directly to spatial data because spatial specificities, such as spatial relationships, must be taken into consideration. This paper focuses on classification with decision trees, one of the data mining techniques. We propose an extension of the C4.5 algorithm for spatial data based on two different approaches: join materialization and querying the different tables on the fly. Similar work has been done on these two main approaches; the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the different tables on the fly, saves memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships among the objects in the target table and those in the neighbor table. The proposed algorithms are applied to a spatial dataset in the accidentology domain. A comparative study of our approach with other works on classification by spatial decision trees is detailed.
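The join-materialization approach can be sketched as follows: the spatial join index and the neighbor attributes are merged into the target table, and a decision tree is trained on the flattened result. scikit-learn's CART tree (with the entropy criterion) is used here only as a stand-in for the modified C4.5, and all tables are hypothetical.

# Sketch of join materialization: merge target table, spatial join index and
# neighbor table into one flat table, then train a decision tree on it.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

target = pd.DataFrame({            # accidents (target table)
    "acc_id": [1, 2, 3, 4],
    "speed_limit": [50, 90, 50, 130],
    "severity": ["light", "severe", "light", "severe"],   # class label
})
neighbors = pd.DataFrame({         # nearby road infrastructure (neighbor table)
    "obj_id": [10, 11, 12],
    "obj_type": ["crossing", "roundabout", "ramp"],
})
join_index = pd.DataFrame({        # possible spatial relationships
    "acc_id": [1, 2, 3, 4],
    "obj_id": [10, 12, 11, 12],
    "relation": ["near", "intersects", "near", "near"],
})

# Materialize the join: spatial relations become ordinary attributes
flat = target.merge(join_index, on="acc_id").merge(neighbors, on="obj_id")
X = pd.get_dummies(flat[["speed_limit", "relation", "obj_type"]])
y = flat["severity"]

clf = DecisionTreeClassifier(criterion="entropy", max_depth=3).fit(X, y)
print("training accuracy:", clf.score(X, y))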

Keywords: C4.5 algorithm, decision trees, S-CART, spatial data mining

Procedia PDF Downloads 612
15677 Magnetic versus Non-Magnetic Adatoms in Graphene Nanoribbons: Tuning of Spintronic Applications and the Quantum Spin Hall Phase

Authors: Saurabh Basu, Sudin Ganguly

Abstract:

Conductance in graphene nanoribbons (GNR) in the presence of magnetic (for example, iron) and non-magnetic (for example, gold) adatoms is explored theoretically within a Kane-Mele model for possible spintronic applications and topologically non-trivial properties. In our work, we consider the magnetic adatoms to induce a Rashba spin-orbit coupling (RSOC) and an exchange bias field, while the non-magnetic ones induce an RSOC and an intrinsic spin-orbit (SO) coupling. Even though RSOC is present in both, they represent very different physical situations: the magnetic adatoms do not preserve time reversal symmetry, while the non-magnetic case does. This has important implications for the topological properties. For example, with non-magnetic adatoms and moderately strong values of SO coupling, the GNR is a quantum spin Hall insulator, as evident from a 2e²/h plateau in the longitudinal conductance and the presence of distinct conducting edge states with an insulating bulk. Since the edge states are protected by time reversal symmetry, the magnetic adatoms in GNR yield trivial insulators and do not possess any non-trivial topological property. However, they have greater utility than the non-magnetic adatoms from the point of view of spintronic applications. Owing to the broken spatial symmetry induced by the presence of adatoms of either type, all the x, y and z components of the spin-polarized conductance become non-zero (only the y-component survives in pristine graphene owing to a mirror symmetry present there), and hence they become suitable for spintronic applications. However, the values of the spin-polarized conductances are at least two orders of magnitude larger in the case of magnetic adatoms than for their non-magnetic counterpart, thereby ensuring more efficient spintronic applications. Further, the applications are tunable by altering the adatom densities.

Keywords: magnetic and non-magnetic adatoms, quantum spin hall phase, spintronic applications, spin polarized conductance, time reversal symmetry

Procedia PDF Downloads 302
15676 Method and System of Malay Traditional Women Apparel Pattern Drafting for Hazi Attire

Authors: Haziyah Hussin

Abstract:

Hazi Attire software is purposely designed for pattern drafting of Malay traditional women's apparel. It is created using a LISP program that runs on the AutoCAD engine and is able to draft various patterns for Malay women's apparel in fitted, semi-fitted and loose silhouettes. It is fully automatic: the user selects a style from the on-screen menu and enters the measurements, and within five seconds the patterns are ready to be printed and sewn. Hazi Attire differs from other programmes available in the market since it is fully automatic, user-friendly and able to print the chosen pattern quickly and accurately. With this software, patterns for the selected styles can be generated according to made-to-measure or standard sizes. It would benefit the apparel industry by reducing manufacturing lead time and cycle time.

Keywords: basic pattern, pattern drafting, toile, Malay traditional women apparel, the measurement parameters, fitted, semi-fitted and loose silhouette

Procedia PDF Downloads 269
15675 Sulfamethoxazole (SMX) Removal by Microwave-Assisted Heterogeneous Fenton Reaction Involving Synthetic Clay (LDHs)

Authors: Chebli Derradji, Abdallah Bouguettoucha, Zoubir Manaa, S. Nacef, A. Amrane

Abstract:

Antibiotics are major pollutants of wastewater, not only due to their stability in biological systems but also due to their impact on public health. Their degradation by means of hydroxyl radicals, generated through the application of microwaves in the presence of hydrogen peroxide and two solid catalysts, an iron-based synthetic clay (LDHs) and goethite (FeOOH), has been examined. A drastic reduction of the degradation yield was observed above pH 4, and hence the optimal conditions were found to be a pH of 3, 0.1 g/L of clay, a somewhat low amount of H2O2 (1.74 mmol/L) and a microwave power of 850 W. It should be noted that, to maintain an almost constant temperature, cooling with cold water was always applied between two microwave runs; the ratio between microwave heating time and cooling time was 1. The obtained SMX degradation was 98.8 ± 0.2% after 30 minutes of microwave treatment. In the absence of the solid catalyst (LDHs), no SMX degradation was observed. From this, the use of microwaves in the presence of a solid source of iron (LDHs) appears to be an efficient solution for the treatment of wastewater containing SMX.

Keywords: microwave, Fenton, heterogeneous Fenton, degradation, oxidation, antibiotics

Procedia PDF Downloads 280
15674 Examination of How Smart Watches Influence the Market for Luxury Watches with Particular Regard to Buying Reasons

Authors: Christopher Benedikt Jakob

Abstract:

In our current society, there is no need to look at a wristwatch to know the exact time: smartphones, the clock in the car or on the computer inform us about the time too. Yet for hundreds of years, luxury watches have held a fascination for human beings. Consumers buy watches that cost thousands of euros, although they could buy much cheaper watches which also fulfill the function of indicating the correct time. This shows that functional value plays a minor role among the reasons for buying luxury watches. For a few years now, people have shown an increased demand to track data such as their daily walking distance or their sleep, and smart watches enable consumers to obtain this information. There is a trend in which people intend to optimise parts of their social life and thus get the impression that they are able to optimise themselves as human beings; with the help of smart watches, they are able to optimise parts of their productivity and realise their targets at the same time. These smart watches are also offered as luxury models, and the question is: how will customers of traditional luxury watches react? Therefore, this study intends to answer the question of why people are willing to spend enormous amounts of money on luxury watches. The self-expression model, the relationship basis model, the functional benefit representation model and means-end theory are chosen as an appropriate methodology to find reasons why human beings purchase specific luxury watches and luxury smart watches. This evaluative approach further discusses these strategies concerning, for example, whether consumers buy luxury watches or smart watches to express the current self or the ideal self, and whether human beings make decisions based on expected results. The research critically evaluates how relationships are compared on the basis of their advantages. Luxury brands offer socio-emotional advantages, such as the social function of identification, and the strong brand personality of luxury watches and luxury smart watches helps customers to structure and retrieve brand awareness, which simplifies the process of decision-making. One of the goals is to identify whether customers know why they like specific luxury watches and dislike others, although they are produced in the same country and cost comparable prices. It is very obvious that the market for luxury watches, and especially for luxury smart watches, is changing much faster than it has in the past. Therefore, the research examines the market-changing parameters in detail.

Keywords: buying-behaviour, brand management, consumer, luxury watch, smart watch

Procedia PDF Downloads 210
15673 Student Diversity in Higher Education: The Impact of Digital Elements on Student Learning Behavior and Subject-Specific Preferences

Authors: Pia Kastl

Abstract:

By combining face-to-face sessions with digital self-learning units, the learning process can be enhanced and learning success improved. The potentials of blended learning are flexibility and the possibility to get in touch with lecturers and fellow students face-to-face. It also offers the opportunity to individualize and self-regulate the learning process. The aim of this article is to analyse how different learning environments affect students' learning behavior and how digital tools can be used effectively. The analysis also considers the extent to which the field of study affects the students' preferences. Semi-structured interviews were conducted with students from different disciplines at two German universities (N = 60). The questions addressed satisfaction with and perception of online, face-to-face and blended learning courses. In addition, suggestions for improving the learning experience and the use of digital tools in the different learning environments were surveyed. The results show that being present on campus has a positive impact on learning success and that online teaching facilitates flexible learning. Blended learning can combine the respective benefits, although one challenge is to keep the time investment within reasonable limits. The use of digital tools differs depending on the subject. Medical students are willing to use digital tools to improve their learning success and voluntarily invest more time. Students of the humanities and social sciences, on the other hand, are reluctant to invest additional time; they do not see extra study material as an additional benefit to their learning success. This study illustrates how these heterogeneous demands on learning environments can be met. In addition, potential for improvement is identified in order to foster both the learning process and learning success. Learning environments can be meaningfully enriched with digital elements to address student diversity in higher education.

Keywords: blended learning, higher education, diversity, learning styles

Procedia PDF Downloads 70
15672 Study of the Effect of Bulk Traps on a Solar-Blind Photodetector Based on an IZTO/β-Ga2O3/ITO Schottky Diode

Authors: Laboratory of Semiconducting, Metallic Materials (LMSM) Biskra Algeria

Abstract:

An InZnSnO2 (IZTO)/β-Ga2O3 solar-blind Schottky barrier photodetector (PD) exposed to 255 nm illumination was simulated and compared to the measurement. The numerical simulations successfully reproduced the photocurrent at reverse bias and the response by taking into account several factors, such as the conduction mechanisms and the material parameters. Reducing the trap density was adopted as the improvement strategy, and the effect of reducing the bulk trap densities on the photocurrent, the response, and the time-dependent (continuous) photoconductivity was studied. As the trap density decreased, the photocurrent increased. The response was 0.04 A/W for the lowest Ga2O3 trap density. The estimated decay time for the lowest intensity at ET (0.74, 1.04 eV) is 0.05 s and is shorter, at ∼0.015 s, for ET (0.55 eV). This indicates that the shallow traps (ET = 0.55 eV) had the dominant effect on the continuous photoconductivity phenomenon. Furthermore, with decreasing trap densities, this PD can be considered a self-powered solar-blind photodetector (SBPD).

Keywords: IZTO/β-Ga2O3, self-powered solar-blind photodetector, numerical simulation, bulk traps

Procedia PDF Downloads 86
15671 Effect of Thermal Treatment on Phenolic Content, Antioxidant, and Alpha-Amylase Inhibition Activities of Moringa stenopetala Leaves

Authors: Daniel Assefa, Engeda Dessalegn, Chetan Chauhan

Abstract:

Moringa stenopetala is a tree of socioeconomic value that is widely available and cultivated in the southern part of Ethiopia. The leaves have traditionally been used as a food source with high nutritional and medicinal value. The present work was carried out to evaluate the effect of thermal treatment on the total phenolic content, antioxidant and alpha-amylase inhibition activities of aqueous leaf extracts during maceration and at different decoction time intervals (5, 10 and 15 min). The total phenolic content was determined by the Folin-Ciocalteu method, whereas the antioxidant activities were determined by 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging, reducing power and ferrous ion chelating assays, and the alpha-amylase inhibition activity was determined using the 3,5-dinitrosalicylic acid method. The total phenolic content ranged from 34.35 to 39.47 mgGAE/g. The 10 min decoction extract showed ferrous ion chelating (92.52), DPPH radical scavenging (91.52%), alpha-amylase inhibition (69.06%) and ferric reducing power (0.765) activities. DPPH, reducing power and alpha-amylase inhibition activities showed positive linear correlations with total phenolic content (R² = 0.853, R² = 0.857 and R² = 0.930, respectively), but ferrous ion chelating activity was found to be weakly correlated (R² = 0.481). Based on the present investigation, it can be concluded that the major loss of total phenolic content, antioxidant and alpha-amylase inhibition activities of the crude leaf extracts of Moringa stenopetala was observed at a decoction time of 15 min. Therefore, to maintain the total phenolic content, antioxidant, and alpha-amylase inhibition activities of the leaves, cooking should be carried out at the optimum decoction time (5-10 min).

Keywords: alpha-amylase inhibition, antioxidant, Moringa stenopetala, total phenolic content

Procedia PDF Downloads 361
15670 Empirical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection and feature extraction, and ends with general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, and then detecting the times at which each selected appliance changes its state. In order to fit the capabilities of existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data is simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical software package that uses behaviour simulation of the people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract a power interval within which the operation of the selected appliance falls, along with a time vector of the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical and statistical features. Afterwards, those signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and real data from the Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
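A minimal sketch of the event detection and DTW matching steps is given below; the power threshold, appliance signatures and aggregate signal are hypothetical placeholders, and the full feature set and tuning procedure of the paper are not reproduced.

# Sketch: threshold-based event detection on a 1/60 Hz aggregate power signal
# followed by DTW matching of the extracted segment against stored appliance
# signatures.  The signatures and thresholds below are hypothetical.
import numpy as np

def detect_events(power, threshold=80.0):
    """Return indices where the power demand jumps by more than `threshold` watts."""
    delta = np.diff(power)
    return np.where(np.abs(delta) > threshold)[0] + 1

def dtw_distance(a, b):
    """Plain O(len(a)*len(b)) dynamic time warping distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

signatures = {                       # hypothetical per-appliance power profiles (W)
    "fridge": np.array([0, 120, 120, 115, 0], dtype=float),
    "kettle": np.array([0, 2000, 2000, 0], dtype=float),
}

aggregate = np.array([50, 50, 170, 170, 165, 50, 50, 2050, 2050, 50], dtype=float)
events = detect_events(aggregate)
segment = aggregate[events[0] - 1:events[1] + 1] - aggregate[events[0] - 1]

best = min(signatures, key=lambda k: dtw_distance(segment, signatures[k]))
print("detected events at samples:", events, "| first segment matched to:", best)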

Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques

Procedia PDF Downloads 82
15669 Structuring of Multilayer Aluminum Nickel by Lift-off Process Using Cheap Negative Resist

Authors: Muhammad Talal Asghar

Abstract:

The photoresist lift-off technique for metal patterning in integrated circuit (IC) packaging has been widely utilized in the fields of microelectromechanical systems and semiconductor component manufacturing. Its main advantages lie in cost savings, reduced complexity, and the maturity of the process. The selection of a photoresist depends upon many factors, such as cost, resist thickness, and the ease of extracting useful process parameters. In the present study, an extremely cheap dry-film photoresist, E8015, with a thickness of 38 micrometers is processed for edge profiling for the first time, to the author's best knowledge. A useful parameter range for resist processing is successfully extracted, and an undercut angle of 66 to 73 degrees is realized by varying parameters such as exposure energy and development time. Finally, a 10-micrometer-thick aluminum-nickel metallic multilayer is lifted off on a plain silicon wafer. Possible applications lie in controlled self-propagating reactions within structured metallic multilayers, which may be utilized for IC packaging in the future.

Keywords: lift-off, IC packaging, photoresist, multilayer

Procedia PDF Downloads 212
15668 Gearbox Defect Detection in the Semi Autogenous Mills Using the Vibration Analysis Technique

Authors: Mostafa Firoozabadi, Alireza Foroughi Nematollahi

Abstract:

Semi-autogenous mills are designed for grinding run-of-mine or primary crushed ore and are the most widely used mills in concentrators globally. Any defect in a semi-autogenous mill can stop the production line. A gearbox is a significant part of a rotating machine or a mill, so gearbox monitoring is a necessary process to prevent unwanted defects. When a defect happens in a gearbox bearing, it can be transferred to the other parts of the equipment, such as the inner ring, outer ring, balls, and the bearing cage. Vibration analysis is one of the most effective and common ways to detect bearing defects in mills. The vibration signal in a mill can be generated by different parts of the mill, including the electromotor, the pinion and girth gear, different rolling bearings, and the tire. When the vibration signals generated by these parts are added to the gearbox vibration spectrum, accurate and timely defect detection in the gearbox becomes difficult. In this paper, a new method is proposed to detect gearbox bearing defects in a semi-autogenous mill accurately and in time, using vibration signal analysis. In this method, if the vibration values increase in the vibration curve, the probability of a defect is investigated by comparing the equipment vibration values with the standard ones. Then, all vibration frequencies are extracted from the vibration signal and the equipment defect is detected using the vibration spectrum curve. The method is implemented on the semi-autogenous mills of the Golgohar Mining and Industrial Company in Iran. The results show that the proposed method can detect bearing looseness accurately and in time. After defect detection, the bearing is opened before the equipment fails and predictive maintenance actions are implemented on it.
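The spectrum-based check can be sketched as follows: the vibration signal is transformed with an FFT, its dominant peaks are extracted, and they are compared with the bearing's characteristic defect frequencies against an amplitude threshold. The defect frequencies, threshold and signal below are hypothetical placeholders.

# Sketch: extract dominant frequencies from a vibration signal and compare
# them with characteristic bearing defect frequencies.
import numpy as np
from scipy.signal import find_peaks

fs = 2000.0                                    # sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
# simulated vibration: shaft rotation + an outer-race defect component + noise
signal = (0.5 * np.sin(2 * np.pi * 25 * t)
          + 0.2 * np.sin(2 * np.pi * 107 * t)
          + np.random.normal(0, 0.05, len(t)))

defect_freqs = {"BPFO": 107.0, "BPFI": 162.0, "BSF": 71.0, "FTF": 10.5}  # Hz, hypothetical
alarm_amplitude = 0.1                          # comparison ("standard") threshold

spec = 2 * np.abs(np.fft.rfft(signal)) / len(signal)
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
peaks, _ = find_peaks(spec, height=alarm_amplitude)

for name, f_def in defect_freqs.items():
    hits = [freqs[p] for p in peaks if abs(freqs[p] - f_def) < 2.0]
    if hits:
        print(f"{name}: peak near {hits[0]:.1f} Hz exceeds threshold, possible defect")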

Keywords: condition monitoring, gearbox defects, predictive maintenance, vibration analysis

Procedia PDF Downloads 465
15667 The Use of Learning Management Systems during Emerging the Tacit Knowledge

Authors: Ercan Eker, Muhammer Karaman, Akif Aslan, Hakan Tanrikuluoglu

Abstract:

A deficiency in institutional memory and knowledge management can result in information security breaches, a loss of prestige and trustworthiness, and, worst of all, the loss of know-how and institutional knowledge. Traditional learning management within organizations is generally handled through personal effort, and that kind of effort mostly depends on personal desire, motivation and a sense of institutional belonging. Even if an organization has highly motivated employees at a certain time, the institutional knowledge and memory life cycle will generally remain limited to the time these employees spend in the organization. Having a learning management system can sustain the institutional memory, knowledge and know-how in the organization. Learning management systems are needed all the more in public organizations, where job rotation is frequent and managers are appointed periodically. However, a learning management system should not be seen as an organization's website; it is a more comprehensive, interactive and user-friendly knowledge management tool for organizations. In this study, the importance of using learning management systems in the process of eliciting tacit knowledge is underlined.

Keywords: knowledge management, learning management systems, tacit knowledge, institutional memory

Procedia PDF Downloads 380
15666 Benchmarking of Pentesting Tools

Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba

Abstract:

The benchmarking of tools for the dynamic analysis of vulnerabilities in web applications is done periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or non-academic websites, and always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific approach perform their analysis with the same methodology as the empirical authors. This paper is motivated by the interest in finding answers to questions that many users of this type of tool have been asking over the years, such as whether a tool truly tests and evaluates every vulnerability it claims to, or whether it really delivers a complete report of all the vulnerabilities tested and exploited. These kinds of questions have also motivated previous work, but without real answers. The aim of this paper is to show results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common benchmarking model used in all those previous works.

Keywords: cybersecurity, IDS, security, web scanners, web vulnerabilities

Procedia PDF Downloads 319
15665 Disrupted or Discounted Cash Flow: Impact of Digitisation on Business Valuation

Authors: Matthias Haerri, Tobias Huettche, Clemens Kustner

Abstract:

This article discusses the impact of digitisation on business valuation. In order to become and remain 'digital', investments are necessary whose return on investment (ROI) often remains vague. This uncertainty is at odds with a valuation that relies on predictable cash flows, fixed capital structures and a steady state. However, digitisation does not make a company valuation impossible; rather, traditional approaches must be reconsidered. The authors identify four areas that are changing: (1) Tools instead of intuition - In the future, company valuation will be neither art nor science, but craft. This does not require intuition, but experience and good tools. Digital valuation tools beyond Excel will therefore gain in importance. (2) Real-time instead of deadline - At present, company valuations are always carried out on a case-by-case basis and on a specific key date. This will change with digitalization and the introduction of web-based valuation tools. Company valuations can thus not only be carried out faster and more efficiently, but can also be offered more frequently. Instead of calculating the value for a previous key date, current and real-time valuations can be carried out. (3) Predictive planning instead of analysis of the past - Past data will also be needed in the future, but its use will not be limited to monovalent time series or key-figure analyses. The images of 'black swans' and the 'turkey illusion' have made clear that we build forecasts on too few data points from the past and underestimate the power of chance. Predictive planning can help here. (4) Convergence instead of residual value - Digital transformation shortens the lifespan of viable business models. If companies want to live forever, they have to change forever. For company valuation, this means that the business model valid on the valuation date has only a limited service life.

Keywords: business valuation, corporate finance, digitisation, disruption

Procedia PDF Downloads 133