Search results for: student performance prediction
5055 Instant Fire Risk Assessment Using Artificial Neural Networks
Authors: Tolga Barisik, Ali Fuat Guneri, K. Dastan
Abstract:
Major industrial facilities have a high potential for fire risk. In particular, the indices used for the detection of hidden fire are very effective at preventing a fire from becoming dangerous in its initial stage. These indices make it possible to prevent or intervene early by determining the stage of the fire, the hazard potential, and the type of combustion agent from the percentage values of the ambient air components. In this system, an artificial neural network of the multi-layer perceptron (supervised-learning) type will be modeled with the determined input data using the Levenberg-Marquardt algorithm, building on the modeling methods in the literature. The actual values produced by the indices will be compared with the outputs produced by the network. Using the neural network and the curves created from the resulting values, the feasibility of performance determination will be investigated.
Keywords: artificial neural networks, fire, Graham Index, Levenberg-Marquardt algorithm, oxygen decrease percentage index, risk assessment, Trickett Index
Procedia PDF Downloads 142
5054 Video Foreground Detection Based on Adaptive Mixture Gaussian Model for Video Surveillance Systems
Authors: M. A. Alavianmehr, A. Tashk, A. Sodagaran
Abstract:
Modeling the background and moving objects is a significant technique for video surveillance and other video processing applications. This paper presents a foreground detection algorithm, based on an adaptive Gaussian mixture model (GMM), that is robust against illumination changes and noise, and provides a novel and practical choice for intelligent video surveillance systems using static cameras. Unlike previous methods, in which the image of still objects (the background image) is given little importance, this method is based on forming a meticulous background image and exploiting it to separate moving objects from their background. The background image is specified either manually, by taking an image without vehicles, or is estimated in real time by forming a mathematical or exponential average of successive images. The proposed scheme offers low image degradation, and the simulation results demonstrate a high degree of performance for the proposed method.
Keywords: image processing, background models, video surveillance, foreground detection, Gaussian mixture model
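As a rough sketch of the background-modelling idea this abstract describes (the exponential average of successive images, followed by thresholded differencing), the following minimal NumPy example is illustrative only; the frame size, learning rate and threshold are assumptions, not values from the paper.

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    # Exponential average of successive images: the background adapts slowly.
    return (1.0 - alpha) * background + alpha * frame

def foreground_mask(background, frame, threshold=25.0):
    # Pixels that differ from the background by more than `threshold` are foreground.
    return np.abs(frame - background) > threshold

# Toy sequence: a static, noisy scene; an "object" appears in the last frame.
rng = np.random.default_rng(0)
frames = [np.full((48, 64), 100.0) + rng.normal(0, 2, (48, 64)) for _ in range(30)]
frames[-1][10:20, 10:20] = 200.0

bg = frames[0].copy()
for f in frames[:-1]:
    bg = update_background(bg, f)

mask = foreground_mask(bg, frames[-1])
print(int(mask.sum()))  # the 10x10 object: 100 foreground pixels
```

A full GMM keeps several such Gaussians per pixel with weights; the single running average above is the simplest special case.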
Procedia PDF Downloads 521
5053 Stabilization of a Three-Pole Active Magnetic Bearing by Hybrid Control Method in Static Mode
Authors: Mahdi Kiani, Hassan Salarieh, Aria Alasty, S. Mahdi Darbandi
Abstract:
The design and implementation of a hybrid control method for a three-pole active magnetic bearing (AMB) are proposed in this paper. The system is inherently nonlinear, and conventional nonlinear controllers are rather complicated, whereas the proposed hybrid controller has a piecewise-linear form, i.e. it is linear in each sub-region. A state-feedback hybrid controller is designed in this study, and the unmeasurable states are estimated by an observer. The gains of the hybrid controller are obtained by the Linear Quadratic Regulator (LQR) method in each sub-region. To evaluate its performance, the designed controller is implemented on an experimental setup in static mode. The experimental results show that the proposed method can efficiently stabilize the three-pole AMB system. The simplicity of the design, the domain of attraction, the uncomplicated control law, and the computational time are advantages of this method over other nonlinear control strategies for AMB systems.
Keywords: active magnetic bearing, three-pole AMB, hybrid control, Lyapunov function
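Since the controller gains above come from LQR in each sub-region, a minimal sketch of how one such gain could be computed is given below; the single-axis AMB model matrices are invented placeholders (the positive stiffness term mimics the open-loop instability of a magnetic bearing), not the authors' plant.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    # Solve the continuous-time algebraic Riccati equation; K = R^-1 B^T P.
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

# Hypothetical linearized model of one AMB axis: x = [position, velocity].
A = np.array([[0.0, 1.0],
              [400.0, 0.0]])   # unstable open loop (negative stiffness)
B = np.array([[0.0],
              [50.0]])
Q = np.diag([100.0, 1.0])      # state weighting
R = np.array([[0.1]])          # control-effort weighting

K = lqr_gain(A, B, Q, R)
closed_loop = A - B @ K
print(np.linalg.eigvals(closed_loop).real)  # both real parts negative: stabilized
```

In the hybrid scheme, one such K would be computed per sub-region and switched according to the current state.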
Procedia PDF Downloads 345
5052 A Generalized Sparse Bayesian Learning Algorithm for Near-Field Synthetic Aperture Radar Imaging: By Exploiting Impropriety and Noncircularity
Authors: Pan Long, Bi Dongjie, Li Xifeng, Xie Yongle
Abstract:
Near-field synthetic aperture radar (SAR) imaging is an advanced nondestructive testing and evaluation (NDT&E) technique. This paper investigates the complex-valued signal processing related to the near-field SAR imaging system, where the measurement data turn out to be noncircular and improper, meaning that the complex-valued data are correlated with their complex conjugate. Furthermore, we discover that the degree of impropriety of the measurement data and that of the target image can be highly correlated in near-field SAR imaging. Based on these observations, a modified generalized sparse Bayesian learning algorithm is proposed, taking impropriety and noncircularity into account. Numerical results show that the proposed algorithm provides a performance gain thanks to the noncircularity assumption on the signals.
Keywords: complex-valued signal processing, synthetic aperture radar, 2-D radar imaging, compressive sensing, sparse Bayesian learning
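The abstract does not give the algorithm's update equations, so purely as a point of reference, here is a minimal real-valued (circular) sparse Bayesian learning loop in the classic Tipping form; the paper's method additionally exploits impropriety, which this sketch does not model, and all sizes and values below are invented.

```python
import numpy as np

def sparse_bayesian_learning(Phi, y, noise_var=1e-4, iters=30, alpha_max=1e10):
    # Classic real-valued SBL: re-estimate per-coefficient precisions alpha;
    # a large alpha effectively prunes the corresponding coefficient.
    n, m = Phi.shape
    alpha = np.ones(m)
    beta = 1.0 / noise_var
    gram = Phi.T @ Phi
    proj = Phi.T @ y
    mu = np.zeros(m)
    for _ in range(iters):
        Sigma = np.linalg.inv(np.diag(alpha) + beta * gram)
        mu = beta * Sigma @ proj
        gamma = 1.0 - alpha * np.diag(Sigma)   # how "well determined" each weight is
        alpha = np.minimum(np.maximum(gamma, 1e-12) / (mu ** 2 + 1e-12), alpha_max)
    return mu

# Synthetic underdetermined problem: 60 measurements, 100 atoms, 3 active.
rng = np.random.default_rng(1)
Phi = rng.standard_normal((60, 100))
w_true = np.zeros(100)
w_true[[5, 17, 42]] = [1.0, -2.0, 1.5]
y = Phi @ w_true + 0.01 * rng.standard_normal(60)

w_hat = sparse_bayesian_learning(Phi, y)
print(sorted(np.argsort(np.abs(w_hat))[-3:].tolist()))  # indices of the dominant coefficients
```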
Procedia PDF Downloads 138
5051 Performance Comparison of AODV and Soft AODV Routing Protocol
Authors: Abhishek, Seema Devi, Jyoti Ohri
Abstract:
A mobile ad hoc network (MANET) is a system of wireless mobile nodes that can self-organize freely and dynamically into an arbitrary and temporary network topology. Unlike in a wired network, a wireless network interface has a limited transmission range. Routing is the task of forwarding data packets from a source to a given destination. The Ad-hoc On-demand Distance Vector (AODV) routing protocol creates a path to a destination only when it is required. This paper describes the implementation of the AODV routing protocol using the MATLAB-based TrueTime simulator. In MANETs, node movements are not fixed; they are random in nature. Hence, intelligent techniques, i.e. fuzzy logic and ANFIS, are used to optimize the transmission range. In this paper, we compare the transmission range of AODV, fuzzy AODV and ANFIS AODV. For the soft-computing AODV, we take transmitted power and received threshold as inputs and transmission range as output. ANFIS gives better results compared to fuzzy AODV.
Keywords: ANFIS, AODV, fuzzy, MANET, reactive routing protocol, routing protocol, TrueTime
Procedia PDF Downloads 502
5050 Effectiveness of New Digital Tools on Implementing Quality Management System: An Exploratory Study of French Companies
Authors: Takwa Belwakess
Abstract:
With the wave of digitization sweeping the modern world, communication tools have taken their place in the world of business. For organizations, being part of the digital era necessarily involves an evolution of management style, mainly in process management, also known as the quality management system (QMS). For more than 50 years, quality management standards have been adopted by organizations to prove their operational and financial performance. We believe that achieving a high level of communication can lead to better quality management and greater customer satisfaction, which is essential to ensuring long-term competitiveness. In this paper, a questionnaire survey was developed to investigate the use of collaboration tools such as content management systems and social networks. Data from more than 100 companies based in France were analyzed; the results show that adopting new digital communication tools while applying quality management practices over a reasonable period contributed to a better implementation of the QMS and to better business performance.
Keywords: communication tools, content management system, digital, effectiveness, French companies, quality management system, quality management practices, social networks
Procedia PDF Downloads 272
5049 Frequent-Pattern Tree Algorithm Application to S&P and Equity Indexes
Authors: E. Younsi, H. Andriamboavonjy, A. David, S. Dokou, B. Lemrabet
Abstract:
Software and time optimization are very important factors in financial markets, which are competitive fields, and the emergence of new computer tools further stresses the challenge. In this context, any improvement of the technical indicators which generate a buy or sell signal is a major issue, and many tools have been created to make them more effective. This concern for efficiency leads the present paper to seek the best (and most innovative) way of achieving the largest improvement in these indicators. The approach consists in attaching a signature to frequent market configurations by applying a frequent-pattern extraction method, which is the most appropriate here for optimizing investment strategies. The goal of the proposed trading algorithm is to find the most accurate signatures, using a back-testing procedure applied to technical indicators, in order to improve their performance. The problem is then to determine the signatures which, combined with an indicator, outperform that indicator alone. To do this, the FP-Tree algorithm has been preferred, as it appears to be the most efficient algorithm for performing this task.
Keywords: quantitative analysis, back-testing, computational models, apriori algorithm, pattern recognition, data mining, FP-tree
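The abstract names the FP-Tree algorithm; as a much shorter illustration of the same frequent-pattern idea (finding market configurations that recur often enough to act as signatures), here is a level-wise Apriori-style miner instead of a true FP-Tree. The indicator-state "transactions" are invented for the example.

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support):
    # Level-wise search: count candidate k-itemsets, keep those meeting
    # min_support, then join survivors into (k+1)-itemset candidates.
    transactions = [frozenset(t) for t in transactions]
    items = sorted({i for t in transactions for i in t})
    frequent, k = {}, 1
    candidates = [frozenset([i]) for i in items]
    while candidates:
        counts = Counter()
        for t in transactions:
            for c in candidates:
                if c <= t:
                    counts[c] += 1
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        keys = list(level)
        candidates = [c for c in {a | b for a, b in combinations(keys, 2)}
                      if len(c) == k + 1]
        k += 1
    return frequent

# Each "transaction" is one hypothetical market configuration of indicator states.
data = [{"RSI_low", "MACD_up"}, {"RSI_low", "MACD_up", "VOL_high"},
        {"RSI_low", "VOL_high"}, {"MACD_up", "VOL_high"}, {"RSI_low", "MACD_up"}]
result = frequent_itemsets(data, min_support=3)
print(result[frozenset({"RSI_low", "MACD_up"})])  # 3
```

An FP-Tree computes the same itemsets with only two passes over the data, which is why the paper prefers it at scale.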
Procedia PDF Downloads 366
5048 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data
Authors: Wanhyun Cho, Soonja Kang, Sanggoon Kim, Soonyoung Park
Abstract:
We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. We consider an efficient computational method that can be used to obtain the approximate posteriors for the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution over the various parameters and the latent function by using variational Bayesian approximations and an importance sampling method, and then derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than the other approximation methods.
Keywords: multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence
Procedia PDF Downloads 447
5047 Enhancement Dynamic Cars Detection Based on Optimized HOG Descriptor
Authors: Mansouri Nabila, Ben Jemaa Yousra, Motamed Cina, Watelain Eric
Abstract:
Research and development efforts in intelligent Advanced Driver Assistance Systems (ADAS) seek to save lives and reduce the number of on-road fatalities. For traffic and emergency monitoring, the essential but challenging task is vehicle detection and tracking in reasonably short time, and this purpose requires, first of all, a powerful dynamic car-detection model. This paper presents an optimized HOG process based on the fusion of shape and motion parameters. Our proposed approach aims to compute block-wise HOG features from foreground blobs using a configurable search window and pathway, in order to overcome the HOG descriptor's shortcoming in terms of computing time and to improve its performance in dynamic applications. Indeed, we show in this paper that the block-wise HOG descriptor combined with motion parameters is a very suitable car detector, reaching a satisfactory recognition rate in record time in dynamic outdoor areas and surpassing several popular works without using sophisticated and expensive architectures such as GPUs and FPGAs.
Keywords: car detector, HOG, motion, computing time
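A minimal sketch of the HOG building block the abstract optimizes, the orientation histogram of one cell: gradient magnitudes are binned by unsigned gradient angle. The cell size, bin count and toy edge image are assumptions for illustration.

```python
import numpy as np

def hog_cell_histogram(cell, bins=9):
    # One cell's HOG: bin gradient angles (0-180 deg, unsigned),
    # weighting each pixel by its gradient magnitude.
    gy, gx = np.gradient(cell.astype(float))
    magnitude = np.hypot(gx, gy)
    angle = np.rad2deg(np.arctan2(gy, gx)) % 180.0
    hist, _ = np.histogram(angle, bins=bins, range=(0, 180), weights=magnitude)
    return hist

# Toy 8x8 cell containing a vertical edge: all gradients point horizontally,
# so the first (0-20 degree) bin should dominate.
cell = np.zeros((8, 8))
cell[:, 4:] = 255.0
h = hog_cell_histogram(cell)
print(np.argmax(h))  # 0
```

A full detector concatenates such histograms over blocks of cells with normalization; computing them only inside foreground blobs is the speed-up the paper pursues.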
Procedia PDF Downloads 325
5046 Interactive Shadow Play Animation System
Authors: Bo Wan, Xiu Wen, Lingling An, Xiaoling Ding
Abstract:
The paper describes a Chinese shadow play animation system based on Kinect. Users, without any professional training, can personally manipulate the shadow characters to give a shadow play performance through their body actions, and can obtain a video of the performance by issuing the record command to the system. In our system, Kinect is responsible for capturing human movement and voice-command data. A gesture recognition module is used to control the changes of the shadow play scenes. After packaging the data from Kinect and the recognition results from the gesture recognition module, VRPN transmits them to the server side. Finally, the server side uses this information to control the motion of the shadow characters and the video recording. The system not only achieves human-computer interaction but also realizes interaction between people. It brings an entertaining experience to users and is easy to operate for all ages. Even more important, applying the technology to Chinese shadow play embodies the protection of the art of shadow play animation.
Keywords: shadow play animation, Kinect, gesture recognition, VRPN, HCI
Procedia PDF Downloads 406
5045 Application of 3D Apparel CAD for Costume Reproduction
Authors: Zi Y. Kang, Tracy D. Cassidy, Tom Cassidy
Abstract:
3D apparel CAD is one of the remarkable products of advanced technology, enabling intuitive design, visualisation and evaluation of garments through stereoscopic drape simulation. Progressive improvements in 3D apparel CAD have led to ever more realistic clothing simulation, which is used not only in design development but also in presentation, promotion and communication for fashion as well as for other industries such as film, games and social network services. As a result, 3D clothing technology is becoming more ubiquitous in human culture and daily life. This study takes the view that such a phenomenon implies the technology has reached maturity, and that it is time to inspect the status of the current technology and explore its potential uses in ways that create cultural value. For this reason, this study aims to generate virtual costumes as culturally significant objects using 3D apparel CAD and to assess its capability and applicability, and the attitudes of audiences towards clothing simulation, through comparison with physical counterparts. Since access to costume collections is often limited due to conservation concerns, the technology may make a valuable contribution to the democratization of culture and knowledge for museums and their audiences. This study is expected to provide foundation knowledge for the development of clothing technology and for expanding the boundary of its practical uses. To prevent any potential damage, two replicas of costumes from the 1860s and 1920s at the Museum of London were chosen as samples. Their structural, visual and physical characteristics were measured and collected using patterns, scanned images of fabrics, and objective fabric measurements with scales, KES-F (Kawabata Evaluation System of Fabrics) and Titan.
The commercial software DC Suite 5.0 was utilised to create virtual costumes from the collected data, and the following outcomes were produced for the evaluation: images of the virtual costumes and video clips showing static and dynamic simulation. Focus groups were arranged with fashion design students and members of the public for the evaluation, in which the outcomes were shown together with the physical samples, fabric swatches and photographs. The similarity, application and acceptance of the virtual costumes were estimated through discussion and a questionnaire. The findings show that the technology can produce realistic or plausible simulations, but that the expression of some factors, such as fine details and the behaviour of lightweight materials, requires improvement. While the public group viewed the virtual costumes as interesting and futuristic replacements for the physical objects, the fashion student group noted more differences in detail and preferred the physical garments, highlighting the absence of tangibility. However, both groups underlined the advantages and potential of virtual costumes as effective and useful visual references for educational and exhibitory purposes. Although 3D apparel CAD has sufficient capacity to assist the garment design process, it has limits for identical replication, and more study on the accurate reproduction of details and drape is needed for its technical improvement. Nevertheless, the virtual costumes in this study demonstrated the possibility for the technology to contribute to the creation of cultural and educational value through its applicability, and as an interesting way to offer 3D visual information.
Keywords: digital clothing technology, garment simulation, 3D apparel CAD, virtual costume
Procedia PDF Downloads 224
5044 Detecting and Secluding Route Modifiers by Neural Network Approach in Wireless Sensor Networks
Authors: C. N. Vanitha, M. Usha
Abstract:
In a real-world scenario, the viability of sensor networks has been proved by the standardization of their technologies. Wireless sensor networks are vulnerable to both electronic and physical security breaches because of their deployment in remote, distributed, and inaccessible locations. Compromised sensor nodes send malicious data to the base station, and thus the effectiveness of the whole network may be compromised. To detect and seclude such route modifiers, a neural-network-based Pattern Learning Predictor (PLP) is presented. The algorithm assesses the data sensed at any node against the present and previous patterns obtained from the en-route nodes, and the standing of a node is updated according to its predicted and reported patterns. This paper propounds a solution not only to detect the route modifiers but also to seclude the malevolent nodes from the network. Simulation results prove the effective performance of the network under the presented methodology in terms of energy level, routing and various network conditions.
Keywords: neural networks, pattern learning, security, wireless sensor networks
Procedia PDF Downloads 408
5043 Analysis of Fault Tolerance on Grid Computing in Real Time Approach
Authors: Parampal Kaur, Deepak Aggarwal
Abstract:
In a computational grid, fault tolerance is an imperative issue to be considered during job scheduling. Due to the widespread use of resources, such systems are highly prone to errors and failures; hence, fault tolerance plays a key role in the grid to avoid the problem of unreliability. Scheduling tasks to appropriate resources is a vital requirement in a computational grid. The fittest-resource scheduling algorithm searches for the appropriate resource based on the job requirements, contrary to general scheduling algorithms, where jobs are scheduled to the resources with the best performance factor. The proposed method improves the fault tolerance of the fittest-resource scheduling algorithm by scheduling the job in coordination with job replication when the resource has low reliability. Based on its reliability index, a resource is identified as critical, and tasks are scheduled according to the criticality of the resources. Results show that the execution time of the tasks is comparatively reduced with the proposed algorithm using a real-time approach rather than a simulator.
Keywords: computational grid, fault tolerance, task replication, job scheduling
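A toy sketch of the scheduling policy described above, fittest-resource selection plus replication when reliability is low; the resource names, capacities and the 0.8 reliability threshold are invented for illustration, not taken from the paper.

```python
def schedule(jobs, resources, reliability_threshold=0.8):
    # Map each job to the fittest matching resource; replicate onto a backup
    # when the chosen resource's reliability marks it as critical.
    plan = []
    for job in jobs:
        candidates = [r for r in resources if r["capacity"] >= job["requirement"]]
        # "Fittest" here: smallest capacity that still satisfies the requirement.
        best = min(candidates, key=lambda r: r["capacity"])
        replicas = [best["name"]]
        if best["reliability"] < reliability_threshold:
            backups = [r for r in candidates if r is not best]
            if backups:
                replicas.append(max(backups, key=lambda r: r["reliability"])["name"])
        plan.append((job["name"], replicas))
    return plan

resources = [{"name": "R1", "capacity": 4,  "reliability": 0.95},
             {"name": "R2", "capacity": 8,  "reliability": 0.60},
             {"name": "R3", "capacity": 16, "reliability": 0.90}]
jobs = [{"name": "J1", "requirement": 2}, {"name": "J2", "requirement": 6}]

plan = schedule(jobs, resources)
print(plan)  # J2 lands on the unreliable R2, so it is replicated onto R3
```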
Procedia PDF Downloads 441
5042 Familial Exome Sequencing to Decipher the Complex Genetic Basis of Holoprosencephaly
Authors: Artem Kim, Clara Savary, Christele Dubourg, Wilfrid Carre, Houda Hamdi-Roze, Valerie Dupé, Sylvie Odent, Marie De Tayrac, Veronique David
Abstract:
Holoprosencephaly (HPE) is a rare congenital brain malformation resulting from the incomplete separation of the two cerebral hemispheres. It is characterized by a wide phenotypic spectrum and a high degree of locus heterogeneity. Genetic defects in 16 genes have already been implicated in HPE, but they account for only 30% of cases, suggesting that a large part of the genetic factors remains to be discovered. HPE has recently been redefined as a complex multigenic disorder, requiring the joint effect of multiple mutational events in genes belonging to one or several developmental pathways. The onset of HPE may result from the accumulation of the effects of multiple rare variants in functionally related genes, each conferring a moderate increase in the risk of HPE onset. In order to decipher the genetic basis of HPE, unconventional patterns of inheritance involving multiple genetic factors need to be considered. The primary objective of this study was to uncover possible disease-causing combinations of multiple rare variants underlying HPE by performing trio-based whole exome sequencing (WES) of familial cases in which no molecular diagnosis could be established. 39 families were selected with no fully penetrant causal mutation in a known HPE gene, no chromosomal aberrations or copy-number variants, and no implication of environmental factors. As the main challenge was to identify disease-related variants among the large number of nonpathogenic polymorphisms detected by the classical WES scheme, a novel variant prioritization approach was established. It combined WES filtering with complementary gene-level approaches: transcriptome-driven (RNA-Seq data) and clinically-driven (public clinical data) strategies. Briefly, a filtering approach was performed to select variants compatible with disease segregation, population frequency and pathogenicity prediction, yielding an exhaustive list of rare deleterious variants.
The exome search space was then reduced by restricting the analysis to candidate genes identified by either the transcriptome-driven strategy (genes sharing highly similar expression patterns with known HPE genes during cerebral development) or the clinically-driven strategy (genes associated with phenotypes of interest overlapping with HPE). Deeper analyses of candidate variants were then performed on a family-by-family basis. These included the exploration of clinical information, expression studies, variant characteristics, recurrence of mutated genes and available biological knowledge. A novel bioinformatics pipeline was designed. Applied to the 39 families, this final integrated workflow identified an average of 11 candidate variants per family. Most candidate variants were inherited from asymptomatic parents, suggesting a multigenic inheritance pattern requiring the association of multiple mutational events. The manual analysis highlighted 5 new strong HPE candidate genes showing recurrences in distinct families. Functional validations of these genes are foreseen.
Keywords: complex genetic disorder, holoprosencephaly, multiple rare variants, whole exome sequencing
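As an illustration of the filtering step described above (disease segregation, population frequency, pathogenicity prediction), here is a toy trio-based variant filter. The thresholds, the segregation rule and the example variants are invented; SHH, ZIC2, SIX3 and GLI2 are, however, genuinely known HPE genes.

```python
def filter_variants(variants, max_pop_freq=1e-3, min_pathogenicity=0.7):
    # Keep rare, predicted-deleterious variants compatible with segregation.
    kept = []
    for v in variants:
        rare = v["pop_freq"] <= max_pop_freq
        deleterious = v["pathogenicity"] >= min_pathogenicity
        # Toy trio rule: carried by the proband, and not by both parents
        # (inheritance from one asymptomatic parent is allowed, as in a
        # multigenic model).
        segregates = v["proband"] and not (v["father"] and v["mother"])
        if rare and deleterious and segregates:
            kept.append(v["gene"])
    return kept

variants = [
    {"gene": "SHH",  "pop_freq": 5e-5, "pathogenicity": 0.9,
     "proband": True, "father": True,  "mother": False},
    {"gene": "GLI2", "pop_freq": 2e-2, "pathogenicity": 0.9,   # too common
     "proband": True, "father": False, "mother": False},
    {"gene": "ZIC2", "pop_freq": 1e-4, "pathogenicity": 0.3,   # predicted benign
     "proband": True, "father": False, "mother": False},
    {"gene": "SIX3", "pop_freq": 2e-4, "pathogenicity": 0.8,
     "proband": True, "father": False, "mother": True},
]

kept_genes = filter_variants(variants)
print(kept_genes)  # ['SHH', 'SIX3']
```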
Procedia PDF Downloads 207
5041 Adaptive E-Learning System Using Fuzzy Logic and Concept Map
Authors: Mesfer Al Duhayyim, Paul Newbury
Abstract:
This paper proposes an effective adaptive e-learning system that uses a coloured concept map to show the learner's knowledge level for each concept in the chosen subject area. A fuzzy logic system is used to evaluate the learner's knowledge level for each concept in the domain and to produce a ranked list of learning materials that addresses weaknesses in the learner's understanding. The system obtains information on the learner's understanding of concepts through an initial pre-test before the system is used for learning and a post-test afterwards. A fuzzy logic system is used to produce a weighted concept map during the learning process. The aim of this research is to show that the proposed adaptive e-learning system enhances the learner's performance and understanding. In addition, this research aims to increase participants' overall awareness of their learning level by providing a coloured concept map of understanding followed by a ranked list of learning materials.
Keywords: adaptive e-learning system, coloured concept map, fuzzy logic, ranked concept list
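A minimal sketch of the fuzzification step such a system could use to map a test score to a knowledge level and rank concepts by weakness; the membership functions, labels and scores are assumptions, not the authors' design.

```python
def triangular(x, a, b, c):
    # Triangular membership function rising from a, peaking at b, falling to c.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def knowledge_level(score):
    # Fuzzify a 0-100 test score into weak / average / good memberships.
    return {
        "weak":    triangular(score, -1, 0, 50),
        "average": triangular(score, 25, 50, 75),
        "good":    triangular(score, 50, 100, 101),
    }

def rank_concepts(scores):
    # Rank concepts so the weakest (highest "weak" membership) come first,
    # mirroring the ranked list of learning materials.
    return sorted(scores, key=lambda c: knowledge_level(scores[c])["weak"],
                  reverse=True)

scores = {"recursion": 30, "loops": 80, "variables": 95}
print(rank_concepts(scores))  # ['recursion', 'loops', 'variables']
```

The memberships could also drive the concept-map colouring, e.g. red for dominant "weak", green for dominant "good".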
Procedia PDF Downloads 297
5040 Genetic Algorithms Based ACPS Safety
Authors: Emine Laarouchi, Daniela Cancila, Laurent Soulier, Hakima Chaouchi
Abstract:
Cyber-physical systems such as drones have proved their efficiency in supporting emergency applications. For these particular applications, travel time and autonomous navigation algorithms are of paramount importance, especially when missions are performed in urban environments with high obstacle density. In this context, however, safety properties are not properly addressed. Our ambition is to optimize the system safety level of autonomous navigation systems while preserving the performance of the CPS. To this aim, we introduce genetic algorithms into the autonomous navigation process of the drone to better infer its trajectory considering the possible obstacles. We first model the desired safety requirements through a cost function and then seek to optimize it through genetic algorithms (GA). The main advantage of GA is the ability to consider different parameters together, for example, the battery level, in navigation-system selection. Our tests show that introducing GA into the autonomous navigation system minimizes the risk of safety loss. Finally, although our simulations were run on autonomous drones, our approach and results could be extended to other autonomous navigation systems such as autonomous cars, robots, etc.
Keywords: safety, unmanned aerial vehicles, CPS, ACPS, drones, path planning, genetic algorithms
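A compact sketch of the core idea, encoding a safety requirement in a cost function and minimizing it with a GA; the goal/obstacle geometry, penalty value and GA operators below are illustrative assumptions, not the paper's formulation.

```python
import random

def evolve(cost, bounds, pop_size=40, generations=120, mutation=0.2):
    # Minimal real-coded GA: tournament selection, averaging crossover,
    # Gaussian mutation clipped to the search bounds.
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            a, b = (min(random.sample(pop, 3), key=cost) for _ in range(2))
            child = [(x + y) / 2 for x, y in zip(a, b)]                 # crossover
            child = [min(hi, max(lo, x + random.gauss(0, mutation)))    # mutation
                     for x in child]
            nxt.append(child)
        pop = nxt
    return min(pop, key=cost)

def trajectory_cost(p):
    # Toy safety-aware cost: distance to the goal, plus a heavy penalty
    # for waypoints that come too close to an obstacle.
    gx, gy, ox, oy = 8.0, 8.0, 4.0, 4.0
    dist_goal = ((p[0] - gx) ** 2 + (p[1] - gy) ** 2) ** 0.5
    dist_obs = ((p[0] - ox) ** 2 + (p[1] - oy) ** 2) ** 0.5
    return dist_goal + (5.0 if dist_obs < 1.5 else 0.0)

random.seed(0)
best = evolve(trajectory_cost, bounds=(0.0, 10.0))
print(best, trajectory_cost(best))
```

In the paper's setting the cost function would also weigh parameters such as battery level, and an individual would encode a whole trajectory rather than a single waypoint.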
Procedia PDF Downloads 185
5039 From Biowaste to Biobased Products: Life Cycle Assessment of VALUEWASTE Solution
Authors: Andrés Lara Guillén, José M. Soriano Disla, Gemma Castejón Martínez, David Fernández-Gutiérrez
Abstract:
The worldwide population is increasing exponentially, which causes a rising demand for food, energy and non-renewable resources. These demands must be attended to from a circular-economy point of view. Under this approach, obtaining strategic products from biowaste is crucial if society is to keep its current lifestyle while reducing the environmental and social issues linked to the linear economy. This is the main objective of the VALUEWASTE project. VALUEWASTE is about valorizing urban biowaste into proteins for food and feed and into biofertilizers, closing the loop of this waste stream. To achieve this objective, the project validates three value chains, all of which begin with the anaerobic digestion of the biowaste. From the anaerobic digestion, three by-products are obtained: i) methane, which is used by microorganisms that are in turn transformed into microbial proteins; ii) digestate, which is used by the black soldier fly to produce insect proteins; and iii) a nutrient-rich effluent, which is transformed into biofertilizers. VALUEWASTE is an innovative solution combining different technologies to valorize the biowaste entirely. However, it must also be demonstrated that the solution is greener than traditional technologies (the baseline systems). On the one hand, the proteins from microorganisms and insects are compared with reference protein production systems (gluten, whey and soybean); on the other hand, the biofertilizers are compared with the production of mineral fertilizers (ammonium sulphate and synthetic struvite). Therefore, the aim of this study is to prove that biowaste valorization can reduce the environmental impacts linked to both traditional protein manufacturing processes and mineral fertilizers, not only at pilot scale but also at industrial scale. In the present study, both the baseline systems and the VALUEWASTE solution are evaluated through Environmental Life Cycle Assessment (E-LCA).
The E-LCA is based on the standards ISO 14040 and 14044, and the Environmental Footprint methodology was used to evaluate the environmental impacts. The results for the baseline cases show that food proteins coming from whey have the highest environmental impact on ecosystems compared to the other protein sources: 7.5 and 15.9 times higher than soybean and gluten, respectively. Comparing feed soybean and gluten, soybean has an environmental impact on human health 195.1 times higher. In the case of the biofertilizers, synthetic struvite has higher impacts than ammonium sulphate: 15.3 times (ecosystems) and 11.8 times (human health), respectively. The results shown in the present study will be used as a reference to demonstrate the better environmental performance of the bio-based products obtained through the VALUEWASTE solution. The E-LCA performed in the VALUEWASTE project also has direct implications for investment and policy. On the one hand, a better environmental performance, backed by the E-LCA, will help remove the barriers linked to these kinds of technologies and boost investment; on the other hand, it will be a seed for designing new policies fostering these types of solutions to achieve two key targets of the European Community: being self-sustainable and carbon neutral.
Keywords: anaerobic digestion, biofertilizers, circular economy, nutrients recovery
Procedia PDF Downloads 92
5038 A Pole Radius Varying Notch Filter with Transient Suppression for Electrocardiogram
Authors: Ramesh Rajagopalan, Adam Dahlstrom
Abstract:
Noise removal techniques play a vital role in the performance of electrocardiographic (ECG) signal processing systems. ECG signals can be corrupted by various kinds of noise, such as baseline wander, electromyographic interference, and power-line interference. One of the significant challenges in ECG signal processing is the degradation caused by additive 50 or 60 Hz power-line interference. This work investigates the removal of power-line interference and the suppression of the transient response when filtering noise-corrupted ECG signals. We demonstrate the effectiveness of an Infinite Impulse Response (IIR) notch filter with a time-varying pole radius for improving the transient behavior: a temporary reduction of the filter's pole radius diminishes the transient. Simulation results show that the proposed IIR filter with a time-varying pole radius outperforms traditional IIR notch filters in terms of mean square error and transient suppression.
Keywords: notch filter, ECG, transient, pole radius
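A sketch of the core idea, a second-order notch whose pole radius is ramped up during start-up so the initial transient is shortened; the linear ramp shape, final radius and signal parameters are assumptions, not the paper's design.

```python
import numpy as np

def varying_notch(x, f_notch, fs, r_final=0.95, ramp=200):
    # Second-order IIR notch with zeros on the unit circle at f_notch and
    # poles at radius r; r grows from near 0 to r_final over `ramp` samples,
    # so the filter starts out almost FIR (short, small transient).
    w0 = 2 * np.pi * f_notch / fs
    c = np.cos(w0)
    y = np.zeros_like(x)
    for n in range(len(x)):
        r = r_final * min(1.0, (n + 1) / ramp)
        xn1 = x[n - 1] if n >= 1 else 0.0
        xn2 = x[n - 2] if n >= 2 else 0.0
        yn1 = y[n - 1] if n >= 1 else 0.0
        yn2 = y[n - 2] if n >= 2 else 0.0
        y[n] = x[n] - 2 * c * xn1 + xn2 + 2 * r * c * yn1 - r * r * yn2
    return y

fs = 500.0
t = np.arange(0, 2, 1 / fs)
ecg_like = np.sin(2 * np.pi * 1.0 * t)            # stand-in for the ECG component
powerline = 0.5 * np.sin(2 * np.pi * 50.0 * t)    # 50 Hz interference
out = varying_notch(ecg_like + powerline, f_notch=50.0, fs=fs)

# After the ramp, the 50 Hz bin is strongly attenuated relative to the 1 Hz bin.
spec = np.abs(np.fft.rfft(out[500:]))
print(round(float(spec[50] / spec[1]), 6))
```

A real design would also normalize the passband gain; here it is close enough to unity for illustration.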
Procedia PDF Downloads 383
5037 Government Policy over the Remuneration System of the Board of Commissioners in Indonesian State-Owned Enterprises
Authors: Synthia Atas Sari
Abstract:
The purpose of this paper is to examine the impact of the reward system, which is determined by the government, on the work of the board of commissioners in implementing good corporate governance in Indonesian state-owned enterprises. To do so, this study analyzes the adequacy of the remuneration, the attractiveness of the position, and the board's commitment and dedication under the remuneration system. A qualitative method is used to examine the significant features of, and challenges to, the government policy on determining remuneration for boards of commissioners. Data were gathered through semi-structured in-depth interviews with twenty-one participants across nine Indonesian state-owned enterprises, and from written documents. The findings of this study indicate that the government policies on the remuneration system are not effective in increasing the performance of boards of commissioners in implementing good corporate governance in Indonesian state-owned enterprises, due to the unattractive remuneration amounts, the demotivation of active members, and conflicts of interest among members of the remuneration committee.
Keywords: reward system, board of commissioners, state-owned enterprises, government policy
Procedia PDF Downloads 341
5036 Distribution-Free Exponentially Weighted Moving Average Control Charts for Monitoring Process Variability
Authors: Chen-Fang Tsai, Shin-Li Lu
Abstract:
Distribution-free control charts are an emerging area of statistical process control in recent years. Various nonparametric control charts have been developed and their detection capability investigated. The major advantage of nonparametric control charts is that the underlying process is not assumed to follow a normal or any other specific parametric distribution. In this paper, two nonparametric exponentially weighted moving average (EWMA) control charts based on nonparametric tests, namely the NE-S and NE-M control charts, are proposed for monitoring process variability. They are extended to generally weighted moving average (GWMA) control charts, namely the NG-S and NG-M control charts, by utilizing design and adjustment parameters for monitoring changes in process variability. The statistical performance of the NG-S and NG-M control charts with run rules is also investigated. Moreover, a sensitivity analysis is performed to show the effects of the design parameters on the nonparametric NG-S and NG-M control charts.
Keywords: distribution-free control chart, EWMA control charts, GWMA control charts
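For reference, here is the standard EWMA statistic with its exact time-varying control limits, shown for a simple mean chart on a known in-control distribution; the paper's charts are nonparametric variability charts, whose test statistics the abstract does not specify, so the data and parameters below are illustrative only.

```python
def ewma_chart(samples, mean, sigma, lam=0.2, L=3.0):
    # EWMA statistic z_t = lam*x_t + (1-lam)*z_{t-1}, with exact
    # time-varying limits mean +/- L * sigma_z(t).
    z = mean
    out_of_control = []
    for t, x in enumerate(samples, start=1):
        z = lam * x + (1 - lam) * z
        var_z = sigma ** 2 * (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * t))
        limit = L * var_z ** 0.5
        if abs(z - mean) > limit:
            out_of_control.append(t)
    return out_of_control

# Five in-control observations followed by a sustained 1.5-sigma upward shift.
data = [0.1, -0.2, 0.05, -0.1, 0.2, 1.5, 1.6, 1.4, 1.7, 1.5]
signals = ewma_chart(data, mean=0.0, sigma=1.0)
print(signals)  # [10]
```

The GWMA generalization replaces the geometric weights (1-lam)^k with a more flexible weighting controlled by the design and adjustment parameters.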
Procedia PDF Downloads 277
5035 Treatment of Industrial Effluents by Using Polyethersulfone/Chitosan Membrane Derived from Fishery Waste
Authors: Suneeta Kumari, Abanti Sahoo
Abstract:
The treatment of industrial effluents is a major problem worldwide, and conventional wastewater treatment methods raise environmental problems of their own. For this reason, many natural biopolymers are now being used in wastewater treatment because they are safe for the environment. In this study, the synthesis and characterization of polyethersulfone/chitosan membranes (thin film composite membranes) are carried out, using fish scales as the raw material. The synthesized membrane is characterized by Fourier transform infrared spectroscopy (FTIR), X-ray powder diffraction (XRD), scanning electron microscopy (SEM), and thermogravimetric analysis (TGA). Membrane performance in terms of flux, rejection, and pore size is also evaluated. The synthesized membrane is used for the treatment of steel industry wastewater, for which biochemical oxygen demand (BOD), chemical oxygen demand (COD), pH, colour, total dissolved solids (TDS), total suspended solids (TSS), electrical conductivity (EC), and turbidity are analysed. Keywords: fish scale, membrane synthesis, treatment of industrial effluents, chitosan
Procedia PDF Downloads 324
5034 The Extension of Monomeric Computational Results to Polymeric Measurable Properties: An Introductory Computational Chemistry Experiment
Authors: Jing Zhao, Yongqing Bai, Qiaofang Shi, Huaihao Zhang
Abstract:
Advances in software technology enable computational chemistry to be commonly applied in various research fields, especially in pedagogy. Thus, in order to expand and improve the experimental instruction of computational chemistry for undergraduates, we designed an introductory experiment: research on the molecular structure and physicochemical properties of acrylamide. Initially, students construct molecular models of acrylamide and polyacrylamide in the Gaussian and Materials Studio software, respectively. Then, the infrared spectral data, atomic charges, and molecular orbitals of acrylamide, as well as the solvation effect of polyacrylamide, are calculated to predict their physicochemical performance. Finally, rheological experiments are used to validate these predictions. Through the combination of molecular simulation (performed in Gaussian and Materials Studio) with experimental verification (rheology experiments), learners gain a deep understanding of the chemical nature of acrylamide and polyacrylamide, achieving good learning outcomes. Keywords: upper-division undergraduate, computer-based learning, laboratory instruction, molecular modeling
Procedia PDF Downloads 138
5033 The AI Arena: A Framework for Distributed Multi-Agent Reinforcement Learning
Authors: Edward W. Staley, Corban G. Rivera, Ashley J. Llorens
Abstract:
Advances in reinforcement learning (RL) have resulted in recent breakthroughs in the application of artificial intelligence (AI) across many different domains. An emerging landscape of development environments is making powerful RL techniques more accessible for a growing community of researchers. However, most existing frameworks do not directly address the problem of learning in complex operating environments, such as dense urban settings or defense-related scenarios, that incorporate distributed, heterogeneous teams of agents. To help enable AI research for this important class of applications, we introduce the AI Arena: a scalable framework with flexible abstractions for distributed multi-agent reinforcement learning. The AI Arena extends the OpenAI Gym interface to allow greater flexibility in learning control policies across multiple agents with heterogeneous learning strategies and localized views of the environment. To illustrate the utility of our framework, we present experimental results that demonstrate performance gains due to a distributed multi-agent learning approach over commonly used RL techniques in several different learning environments. Keywords: reinforcement learning, multi-agent, deep learning, artificial intelligence
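The kind of interface the abstract describes, a Gym-style interaction loop generalized to several agents with localized observations, can be illustrated with a toy sketch. The class and method names below are illustrative assumptions, not the actual AI Arena API.

```python
# A minimal multi-agent, Gym-style environment sketch: each agent receives
# its own localized observation and submits its own action each step.
# This is a hypothetical stand-in for illustration, not the AI Arena API.

class MultiAgentEnv:
    def __init__(self, n_agents=2):
        self.n_agents = n_agents
        self.t = 0

    def reset(self):
        self.t = 0
        # One localized observation per agent (here: just the agent's index).
        return [float(i) for i in range(self.n_agents)]

    def step(self, actions):
        assert len(actions) == self.n_agents, "one action per agent"
        self.t += 1
        obs = [float(i) + self.t for i in range(self.n_agents)]
        # Toy per-agent reward: 1.0 for acting, 0.0 otherwise.
        rewards = [1.0 if a > 0 else 0.0 for a in actions]
        done = self.t >= 10
        return obs, rewards, done

env = MultiAgentEnv(n_agents=3)
obs = env.reset()
obs, rewards, done = env.step([1, 0, 1])
```

Heterogeneous learning strategies would then attach a different policy (and learner) to each agent's observation/action slot in this loop.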
Procedia PDF Downloads 164
5032 Evaluation of Wind Fragility for Set Anchor Used in Sign Structure in Korea
Authors: WooYoung Jung, Buntheng Chhorn, Min-Gi Kim
Abstract:
Recently, damage to domestic facilities caused by strong winds and typhoons has been growing. This study therefore focused on sign structures, among various vulnerable facilities. The evaluation of wind fragility was carried out considering the destruction of the anchor, which is one of the various failure modes of a sign structure, and a performance evaluation of the anchor was conducted to derive the wind fragility. Two parameters were set and four anchor types were selected for the pull-out and shear tests, and the resistance capacity was estimated from the experimental results. Wind loads were estimated using the Monte Carlo simulation method. Based on these results, the wind fragility was derived according to anchor type and wind exposure category. Finally, the wind fragility was evaluated according to the experimental parameters, namely anchor length and anchor diameter. This study shows that the embedment depth of the anchor was more significant for structural safety than the anchor diameter. Keywords: sign structure, wind fragility, set anchor, pull-out test, shear test, Monte Carlo simulation
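The Monte Carlo step can be sketched generically: for a given wind speed, sample a random wind load and a random anchor resistance, and take the exceedance fraction as the failure probability at that speed. The lognormal parameters and capacity value below are illustrative assumptions, not values from the study.

```python
import random

# Hedged sketch of fragility estimation by Monte Carlo simulation.
# Load is taken proportional to wind speed squared; the scatter factors
# and the 2500-unit pull-out capacity are illustrative assumptions.

def fragility(wind_speed, n=20000, seed=0):
    """Estimated probability that wind load exceeds anchor resistance."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        load = wind_speed ** 2 * rng.lognormvariate(0.0, 0.2)   # load ~ V^2
        resistance = 2500.0 * rng.lognormvariate(0.0, 0.15)     # pull-out capacity
        if load > resistance:
            failures += 1
    return failures / n
```

Repeating this over a grid of wind speeds traces out the fragility curve for one anchor type; different anchor types or exposure categories change the resistance and load models.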
Procedia PDF Downloads 291
5031 The Measurement of City Brand Effectiveness as Methodological and Strategic Challenge: Insights from Individual Interviews with International Experts
Authors: A. Augustyn, M. Florek, M. Herezniak
Abstract:
Since public authorities are constantly pressured by public opinion to showcase the tangible and measurable results of their efforts, the evaluation of place brand-related activities becomes a necessity. Given the political and social character of the place branding process, the legitimization of branding efforts requires that the objectives set out in the city brand strategy comply with the actual needs, expectations, and aspirations of various internal stakeholders. To deliver on these diverse promises, city authorities and brand managers need to translate them into measurable indicators against which the effectiveness of the brand strategy will be evaluated. In concert with these observations are findings from the branding and marketing literature, where there is widespread consensus that places should adopt a more systematic and holistic approach in order to ensure the performance of their brands. However, the measurement of the effectiveness of place branding remains insufficiently explored in theory, even though it is considered a significant step in the process of place brand management. Therefore, the aim of the research presented in this paper was to collect insights on the nature of effectiveness measurement for city brand strategies and to juxtapose these findings with theoretical assumptions formed on the basis of a state-of-the-art literature review. To this end, 15 international academic experts (out of 18 initially selected), affiliated with institutions in ten countries on five continents, were individually interviewed. A standardized set of 19 open-ended questions was used for all interviewees, who had been selected based on their expertise and reputation in the fields of place branding and marketing.
Findings were categorized into four modules: (i) conceptualizations of city brand effectiveness, (ii) methodological issues of city brand effectiveness measurement, (iii) the nature of the measurement process, and (iv) the articulation of key performance indicators (KPIs). Within each module, the interviewees offered diverse insights into the subject based on their academic expertise and professional activity as consultants. They proposed a twofold understanding of effectiveness: a narrow one, in which it is conceived as the ability to achieve specific goals, and a broad one, in which city brand effectiveness is seen as an improvement in the social and economic reality of a place, which in turn poses diverse challenges for measurement concepts and processes. Moreover, the respondents offered a variety of insights into methodological issues, particularly the need for customization and flexibility of measurement systems, the employment of an interdisciplinary approach to measurement, and the implications resulting therefrom. Considerable emphasis was put on an inward approach to measurement, namely the necessity to monitor residents' evaluations of brand-related activities instead of benchmarking cities against a competitive set. Other findings encompass the issues of developing appropriate KPIs for the city brand, managing the measurement process, and including diverse stakeholders to produce a sound measurement system. Furthermore, the interviewees enumerated the most frequent mistakes in measurement, mainly resulting from a misunderstanding of the nature of city brands. This research was financed by the National Science Centre, Poland, research project no. 2015/19/B/HS4/00380, Towards the categorization of place brand strategy effectiveness indicators – findings from strategic documents of Polish district cities – theoretical and empirical approach. Keywords: city branding, effectiveness, experts' insights, measurement
Procedia PDF Downloads 149
5030 Performance Analysis of Arithmetic Units for IoT Applications
Authors: Nithiya C., Komathi B. J., Praveena N. G., Samuda Prathima
Abstract:
At present, the ultimate aim in digital system design, especially at the gate level and lower levels of design abstraction, is power optimization. Adders are a nearly universal component of today's integrated circuits, and most prior research has focused on the design of high-speed adders based on various adder structures. This paper discusses the ideal path for selecting an arithmetic unit for IoT applications. Based on an analysis of eight types of 16-bit adders, we found that the carry look-ahead (CLA) adder consumes the least power. Additionally, a multiplier and accumulator (MAC) unit is implemented with a Booth multiplier using the low-power adders in order of preference. The design is synthesized and verified using Synopsys Design Compiler and VCS, and then implemented using Cadence Encounter. The total power consumed by the CLA-based Booth multiplier is 0.03527 mW, the total area occupied is 11260 µm², and the delay is 2034 ps. Keywords: carry look-ahead, carry select adder, CSA, internet of things, ripple carry adder, design rule check, power delay product, multiplier and accumulator
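The carry look-ahead principle behind the CLA adder can be illustrated behaviorally: per-bit generate and propagate signals determine every carry directly from the inputs, which hardware evaluates in parallel instead of letting carries ripple bit by bit. The sketch below models that logic in software for a 16-bit adder; it is an illustration of the principle, not the synthesized design from the paper.

```python
# Behavioral sketch of carry look-ahead addition. For each bit i:
#   g_i = a_i AND b_i   (carry generated at bit i)
#   p_i = a_i OR  b_i   (carry propagated through bit i)
#   c_{i+1} = g_i OR (p_i AND c_i)
# Hardware expands this recurrence so all carries are computed in parallel;
# the Python loop below evaluates the same equations sequentially.

def cla_add(a, b, width=16, cin=0):
    """Add two unsigned integers; return (sum mod 2**width, carry_out)."""
    carries = [cin]
    for i in range(width):
        ai = (a >> i) & 1
        bi = (b >> i) & 1
        g = ai & bi
        p = ai | bi
        carries.append(g | (p & carries[i]))
    total = 0
    for i in range(width):
        s = ((a >> i) & 1) ^ ((b >> i) & 1) ^ carries[i]
        total |= s << i
    return total, carries[width]
```

Because no carry waits on the one below it in the expanded form, the CLA trades extra gates (and some power) for a delay that grows far more slowly with width than a ripple carry adder's.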
Procedia PDF Downloads 120
5029 Investigation of Physical Properties of Asphalt Binder Modified by Recycled Polyethylene and Ground Tire Rubber
Authors: Sajjad H. Kasanagh, Perviz Ahmedzade, Alexander Fainleib, Taylan Gunay
Abstract:
Modification of asphalt is a widely used method around the world, mainly for the purpose of providing more durable pavements and thereby reducing repair costs over the lifetime of highways. Polymers such as styrene-butadiene-styrene (SBS) and ethylene vinyl acetate (EVA) make up the greater part of asphalt modifiers, generally improving the physical properties of asphalt by decreasing its temperature dependency, which in turn reduces permanent deformation of highways, such as rutting. However, waste and low-cost materials such as recycled plastics and ground tire rubber have been tried as asphalt modifiers in place of manufactured polymer modifiers in order to reduce the eventual highway cost. Moreover, the use of recycled plastics has become a worldwide necessity and a matter of growing awareness, in order to decrease the pollution caused by waste plastics. Hence, many research teams have sought applications in which recycled plastics could be utilized, so as to reduce both polymer manufacturing and plastic pollution. To this end, in this paper, a thermoplastic dynamic vulcanizate (TDV) obtained from recycled post-consumer polyethylene and ground tire rubber (GTR) was used to provide an efficient asphalt modifier that also decreases the production cost and might offer an ecological solution by alleviating polymer disposal problems. The TDV was synthesized by the chemists in the research group from the abovementioned components, which are considered physically compatible with asphalt. TDV-modified asphalt samples with proportions of 3, 4, 5, 6, and 7 wt.% TDV modifier were prepared. Conventional tests, such as penetration, softening point, and rolling thin film oven (RTFO) tests, were performed to obtain the fundamental physical and aging properties of the base and modified binders.
The high-temperature performance grade (PG) of the binders was determined by Superpave tests conducted on original and aged binders. The multiple stress creep and recovery (MSCR) test, a relatively recent method for classifying asphalts that takes their elastic recovery into account, was carried out to evaluate the PG-plus grades of the binders. The results obtained from performance grading and MSCR tests were also evaluated together so as to compare the two methods, both of which aim to determine the rheological parameters of asphalt. The test results revealed that TDV modification leads to a decrease in penetration and an increase in softening point, which indicates increasing stiffness of the asphalt. DSR results indicate an improvement in PG for the modified binders compared to the base asphalt. The MSCR results, which are compatible with the DSR results, also indicate an enhancement of the rheological properties of the asphalt. However, according to the results, the improvement is not as distinct as that observed in the DSR results, since elastic properties are fundamental in the MSCR test. At the end of the testing program, it can be concluded that TDV can be used as a modifier that provides better rheological properties for asphalt and might diminish plastic waste pollution, since the material is 100% recycled. Keywords: asphalt, ground tire rubber, recycled polymer, thermoplastic dynamic vulcanizate
Procedia PDF Downloads 222
5028 Forecasting Stock Prices Based on the Residual Income Valuation Model: Evidence from a Time-Series Approach
Authors: Chen-Yin Kuo, Yung-Hsin Lee
Abstract:
Previous studies applying the residual income valuation (RIV) model generally use panel data and single-equation models to forecast stock prices. Unlike these, this paper uses Taiwanese longitudinal data to estimate multi-equation time-series models, such as the vector autoregressive (VAR) model and the vector error correction model (VECM), and conducts out-of-sample forecasting. Further, this work assesses their forecasting performance using two evaluation instruments. Consistent with extant research, the major finding is that the VECM outperforms the other three models in forecasting for three stock sectors over all horizons. This implies that an error correction term containing long-run information contributes to improved forecasting accuracy. Moreover, the composite pattern shows that at longer horizons the VECM produces the greater reduction in errors and performs substantially better than the VAR. Keywords: residual income valuation model, vector error correction model, out-of-sample forecasting, forecasting accuracy
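The out-of-sample evaluation procedure can be sketched generically: hold out the last h observations, forecast them from the training window, and rank models by an accuracy measure such as root mean squared error (RMSE). The naive and drift forecasters below are simple illustrative stand-ins, not the paper's VAR/VECM, which require an econometrics library to estimate.

```python
import math

# Hedged sketch of out-of-sample forecast comparison by RMSE.
# "naive" repeats the last observed value; "drift" extrapolates the
# average historical slope. Both are stand-ins for the estimated models.

def rmse(actual, forecast):
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def naive_forecast(train, h):
    return [train[-1]] * h

def drift_forecast(train, h):
    slope = (train[-1] - train[0]) / (len(train) - 1)
    return [train[-1] + slope * (i + 1) for i in range(h)]

series = [10, 11, 12, 13, 14, 15, 16, 17]   # illustrative price series
train, test = series[:-3], series[-3:]       # hold out the last 3 points
err_naive = rmse(test, naive_forecast(train, 3))
err_drift = rmse(test, drift_forecast(train, 3))
```

In the paper's setting, the model whose held-out errors are smallest across horizons (the VECM, per the abstract) is declared the better forecaster; the comparison logic is exactly this, with richer models and a second accuracy measure.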
Procedia PDF Downloads 321
5027 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach
Authors: Elias K. Maragos, Petros E. Maravelakis
Abstract:
In dynamic data envelopment analysis (DDEA), a subfield of data envelopment analysis (DEA), the productivity of decision making units (DMUs) is considered in relation to time. In this setting, as accepted by most researchers, some outputs produced by a DMU are used as its inputs in a future period; these outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of the input, output, or intermediate data, assuming that the distribution of their virtual values does not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs. Keywords: dynamic data envelopment analysis, DDEA, piecewise linear inputs, piecewise linear outputs
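The piecewise linear idea can be illustrated in miniature: a variable is decomposed over segments defined by breakpoints, so each segment can carry its own marginal weight and the virtual value need not be linear over the whole range. The breakpoints and weights below are illustrative assumptions, not the paper's model.

```python
# Sketch of piecewise linear decomposition of a single input/output value.
# The breakpoints and per-segment weights are illustrative; in a DEA model
# the weights would be decision variables of the linear program.

def piecewise_decompose(x, breakpoints):
    """Return the portion of x falling in each segment [b_i, b_{i+1}]."""
    parts = []
    for lo, hi in zip(breakpoints, breakpoints[1:]):
        parts.append(min(max(x - lo, 0.0), hi - lo))
    return parts

def piecewise_value(x, breakpoints, slopes):
    """Virtual value of x when each segment has its own marginal weight."""
    return sum(s * p for s, p in zip(slopes, piecewise_decompose(x, breakpoints)))

# x = 4 over segments [0,2], [2,5], [5,10] splits into 2 + 2 + 0,
# and a decreasing weight per segment gives a concave virtual value.
value = piecewise_value(4, [0, 2, 5, 10], [1.0, 0.5, 0.2])
```

Replacing each raw variable with its segment portions is what lets the extended model's virtual values deviate from linearity while the optimization itself stays linear.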
Procedia PDF Downloads 164
5026 Thermal Performance Analysis of Nanofluids in a Concentric Heat Exchanger Equipped with Turbulators
Authors: Feyza Eda Akyurek, Bayram Sahin, Kadir Gelis, Eyuphan Manay, Murat Ceylan
Abstract:
The turbulent forced convection heat transfer and pressure drop characteristics of an Al2O3–water nanofluid flowing through a concentric tube heat exchanger with and without coiled wire turbulators were studied experimentally. The experiments were conducted over a Reynolds number range of 4000 to 20000 at particle volume concentrations of 0.8 vol.% and 1.6 vol.%. Two turbulators with pitches of 25 mm and 39 mm were used. The results indicated that the average Nusselt number of the nanofluids increased with Reynolds number much more than that of pure water, and the thermal conductivity enhancement provided by the nanofluids resulted in heat transfer enhancement. The pressure drop of the alumina/water nanofluid was found to be nearly equal to that of pure water over the same Reynolds number range, so it was concluded that nanofluids at volume fractions of 0.8 and 1.6 vol.% do not have a significant effect on the pressure drop. However, the use of wire coils in the heat exchanger enhanced heat transfer but also increased the pressure drop. Keywords: turbulators, heat exchanger, nanofluids, heat transfer enhancement
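For context on the Reynolds number dependence the abstract reports, the plain-tube turbulent baseline is often taken from the textbook Dittus-Boelter correlation, Nu = 0.023 Re^0.8 Pr^0.4 (for heating). The short calculation below evaluates it at the ends of the tested Reynolds range; the Prandtl number used is an illustrative assumption, and this is a textbook correlation, not the paper's measured data.

```python
# Illustrative plain-tube baseline: Dittus-Boelter correlation for fully
# developed turbulent forced convection in a smooth tube (heating case).
# Pr = 6.0 is an assumed, water-like value, not a value from the study.

def dittus_boelter(re, pr):
    """Average Nusselt number, Nu = 0.023 * Re^0.8 * Pr^0.4."""
    return 0.023 * re ** 0.8 * pr ** 0.4

nu_low = dittus_boelter(4000, 6.0)    # lower end of the tested Reynolds range
nu_high = dittus_boelter(20000, 6.0)  # upper end of the tested Reynolds range
```

Nanofluids and wire-coil turbulators each raise the measured Nusselt number above this smooth-tube baseline, which is how the heat transfer enhancement in the abstract is quantified.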
Procedia PDF Downloads 411