Search results for: fuzzy multiple objective programming
11692 Peak Data Rate Enhancement Using Switched Micro-Macro Diversity in Cellular Multiple-Input-Multiple-Output Systems
Authors: Jihad S. Daba, J. P. Dubois, Yvette Antar
Abstract:
With the exponential growth of cellular users, a new generation of cellular networks is needed to enhance the required peak data rates. Co-channel interference between neighboring base stations inhibits peak data rate increases. To overcome this interference, multi-cell cooperation, known as coordinated multipoint transmission, is proposed. Such a solution makes use of multiple-input-multiple-output (MIMO) systems under two different structures: micro- and macro-diversity. In this paper, we study the capacity and bit error rate of cellular networks using MIMO technology. We analyse both micro- and macro-diversity schemes and develop a hybrid model that switches between macro- and micro-diversity in the case of hard handoff, based on a cut-off range of signal-to-noise ratio values. We conclude that our hybrid switched micro-macro MIMO system outperforms classical MIMO systems at the cost of increased hardware and software complexity.
Keywords: cooperative multipoint transmission, ergodic capacity, hard handoff, macro-diversity, micro-diversity, multiple-input-multiple-output systems, orthogonal frequency division multiplexing
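The two quantities this abstract combines are easy to sketch: the ergodic capacity of an i.i.d. Rayleigh MIMO channel, and an SNR-threshold switch between micro-diversity (co-located antennas) and macro-diversity (cooperating base stations). The sketch below is illustrative only; the antenna counts, cut-off value and channel model are assumptions, not the paper's configuration.

```python
# Monte-Carlo ergodic capacity of a MIMO Rayleigh channel, plus a hypothetical
# SNR-based switch between micro- and macro-diversity at hard handoff.
import numpy as np

rng = np.random.default_rng(0)

def ergodic_capacity(nt, nr, snr_db, trials=2000):
    """Monte-Carlo average of log2 det(I + (SNR/nt) * H H^H)."""
    snr = 10 ** (snr_db / 10)
    total = 0.0
    for _ in range(trials):
        H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        total += np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * H @ H.conj().T)).real
    return total / trials

SNR_CUTOFF_DB = 5.0   # hypothetical hard-handoff switching point

def choose_scheme(snr_db):
    # Near the cell edge (low SNR), cooperating base stations help most.
    return "macro-diversity (CoMP)" if snr_db < SNR_CUTOFF_DB else "micro-diversity"

for snr_db in (0, 10, 20):
    print(f"SNR {snr_db:2d} dB: C = {ergodic_capacity(2, 2, snr_db):.2f} bit/s/Hz, "
          f"use {choose_scheme(snr_db)}")
```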
Procedia PDF Downloads 314
11691 A Comprehensive Methodology for Voice Segmentation of Large Sets of Speech Files Recorded in Naturalistic Environments
Authors: Ana Londral, Burcu Demiray, Marcus Cheetham
Abstract:
Speech recording is a methodology used in many studies in cognitive and behavioural research. Modern advances in digital equipment have made it possible to record hours of speech continuously in naturalistic environments and to build rich sets of sound files. Speech analysis can then extract multiple features from these files for different scopes of research in language and communication. However, tools that analyse a large set of sound files and automatically extract relevant features are often inaccessible to researchers who are not familiar with programming languages. Manual analysis is a common alternative, with a high cost in time and efficiency. In the analysis of long sound files, the first step is voice segmentation, i.e., detecting and labelling the segments that contain speech. We present a comprehensive methodology aiming to support researchers in voice segmentation as the first step in the data analysis of a large set of sound files. Praat, an open-source software package, is suggested as a tool to run a voice detection algorithm, label segments and files, and extract other quantitative features over a structure of folders containing a large number of sound files. We present the validation of our methodology with a set of 5000 sound files collected in the daily life of a group of voluntary participants aged over 65. A smartphone device was used to collect sound using the Electronically Activated Recorder (EAR), an app programmed to record 30-second sound samples randomly distributed throughout the day. Results demonstrated that automatic segmentation and labelling of files containing speech segments was 74% faster than manual analysis performed by two independent coders. Furthermore, the methodology presented allows manual adjustment of voiced segments with visualisation of the sound signal, and the automatic extraction of quantitative information on speech. In conclusion, we propose a comprehensive methodology for voice segmentation, to be used by researchers who have to work with large sets of sound files and are not familiar with programming tools.
Keywords: automatic speech analysis, behavior analysis, naturalistic environments, voice segmentation
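As an illustration of the segmentation step, here is a minimal energy-based voice detector over a folder of WAV files. It only mirrors the idea; the paper's pipeline is built on Praat, and the folder name, threshold and frame size below are assumptions. The sketch assumes mono 16-bit PCM files.

```python
# Naive energy-based voice segmentation over a folder of WAV files
# (stand-in for the Praat-based pipeline described above).
import wave
from pathlib import Path
import numpy as np

FRAME_MS = 30
THRESHOLD = 0.02   # fraction of the file's peak RMS counted as "speech"

def voiced_segments(path):
    with wave.open(str(path), "rb") as w:
        rate = w.getframerate()
        audio = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    audio = audio.astype(np.float64) / 32768.0           # assumes mono 16-bit PCM
    frame = int(rate * FRAME_MS / 1000)
    n = len(audio) // frame
    rms = np.sqrt(np.mean(audio[:n * frame].reshape(n, frame) ** 2, axis=1))
    voiced = rms > THRESHOLD * rms.max()
    # Report (start_s, end_s) for each run of voiced frames
    segments, start = [], None
    for i, v in enumerate(voiced):
        if v and start is None:
            start = i
        elif not v and start is not None:
            segments.append((start * FRAME_MS / 1000, i * FRAME_MS / 1000))
            start = None
    if start is not None:
        segments.append((start * FRAME_MS / 1000, n * FRAME_MS / 1000))
    return segments

for f in sorted(Path("recordings").glob("*.wav")):       # hypothetical folder
    print(f.name, voiced_segments(f))
```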
Procedia PDF Downloads 282
11690 God, The Master Programmer: The Relationship Between God and Computers
Authors: Mohammad Sabbagh
Abstract:
Anyone who reads the Torah or the Quran learns that GOD created everything that is around us, seen and unseen, in six days. Within HIS plan of creation, HE placed for us a key proof of HIS existence, which is essentially computers and the ability to program them. Digital computer programming began with binary instructions, which eventually evolved into what is known as high-level programming languages. Any programmer in our modern time can attest that you are essentially giving the computer commands in words, and when the program is compiled, whatever is processed as output is limited to the abilities and instructions the computer was given. So one can deduce that GOD created everything around us with HIS words, programming everything in six days, just as we can program a virtual world on a computer. GOD did mention in the Quran that one day where GOD's throne is equals 1000 years of what we count; therefore, one might understand that GOD spoke non-stop for 6000 years of what we count and gave everything its functions, attributes, classes, methods and interactions, similar to what we do in object-oriented programming. Of course, GOD has the higher example, and what HE created is much more than OOP. So when GOD said that everything is already predetermined, it is because for any input, whether physical, spiritual or by thought, produced by any of HIS creatures, the answer has already been programmed. Any path, any thought, any idea has already been laid out with a reaction to any decision an inputter makes. Exalted is GOD! GOD refers to HIMSELF as The Fastest Accountant in The Quran; the Arabic word that was used is close to processor or calculator. If you create a 3D simulation of a supernova explosion to understand how GOD produces certain elements and fuses protons together to spread more of HIS blessings around HIS skies, then in 2022 you would require one of the strongest, fastest and most capable supercomputers in the world, with a theoretical speed of 50 petaFLOPS, to accomplish that; in other words, the ability to perform one quadrillion (10^15) floating-point operations per second, a number a human cannot even fathom. To put it in more perspective, while the computer is going through those 50 petaFLOPS of calculations per second, GOD is calculating all the physics of every atom, and of what is smaller than that, in the actual explosion, and it is all in truth. When GOD said HE created the world in truth, one of the meanings a person can understand is that when certain things occur around you, whether how a car crashes or how a tree grows, there is a science and a way to understand it, and whatever programming or science you deduce from whatever event you observed can relate to other similar events. That is why GOD might have said in The Quran that it is the people of knowledge, scholars, or scientists who fear GOD the most! One thing that is essential for us, to keep up with what the computer is doing and to track our progress along with any errors, is that we incorporate logging mechanisms and backups. GOD said in The Quran that 'WE used to copy what you used to do'. Essentially, as the world is running, think of it as an interactive movie that is being played out in front of you, in a fully immersive, non-virtual reality setting. GOD is recording it, from every angle to every thought, to every action.
This gives an idea of how fearsome the Day of Judgment will be, when one realizes that it is going to be a fully immersive video as we receive and read our book.
Keywords: programming, the Quran, object orientation, computers and humans, GOD
Procedia PDF Downloads 107
11689 Modeling of Age Hardening Process Using Adaptive Neuro-Fuzzy Inference System: Results from Aluminum Alloy A356/Cow Horn Particulate Composite
Authors: Chidozie C. Nwobi-Okoye, Basil Q. Ochieze, Stanley Okiy
Abstract:
This research reports on the modeling of the age hardening process using an adaptive neuro-fuzzy inference system (ANFIS). The age hardening output (hardness) was predicted using ANFIS. The input parameters were ageing time, temperature and the percentage composition of cow horn particles (CHp%). The results show that the correlation coefficient (R) of the predicted hardness values versus the measured values was 0.9985. Subsequently, values outside the experimental data points were predicted. When the temperature was kept constant and the other input parameters were varied, the average relative error of the predicted values was 0.0931%. When the temperature was varied and the other input parameters kept constant, the average relative error of the hardness predictions was 80%. The results show that ANFIS with coarse experimental data points for learning is not very effective in predicting process outputs in the age hardening operation of the A356 alloy/CHp particulate composite. The fine experimental data requirements of ANFIS make it more expensive in the modeling and optimization of age hardening operations of the A356 alloy/CHp particulate composite.
Keywords: adaptive neuro-fuzzy inference system (ANFIS), age hardening, aluminum alloy, metal matrix composite
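The two quoted figures of merit are easy to reproduce for any predictor. A minimal sketch, with synthetic placeholder values standing in for the ANFIS outputs:

```python
# Correlation coefficient R and average relative error between predicted and
# measured hardness. The values below are hypothetical placeholders.
import numpy as np

measured  = np.array([62.1, 64.8, 67.5, 70.2, 72.9])   # hypothetical hardness values
predicted = np.array([62.0, 64.9, 67.4, 70.3, 72.8])

R = np.corrcoef(predicted, measured)[0, 1]
avg_rel_error = 100 * np.mean(np.abs(predicted - measured) / measured)

print(f"R = {R:.4f}")                       # near 1 -> strong agreement
print(f"average relative error = {avg_rel_error:.4f}%")
```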
Procedia PDF Downloads 155
11688 How to Enhance Performance of Universities by Implementing Balanced Scorecard with Using FDM and ANP
Authors: Neda Jalaliyoon, Nooh Abu Bakar, Hamed Taherdoost
Abstract:
The present research recommends a balanced scorecard (BSC) framework to appraise the performance of universities. As the original balanced scorecard model has four perspectives, the present research uses the same model, with the perspectives “financial”, “customer”, “internal process” and “learning and growth”. By applying the fuzzy Delphi method (FDM) and a questionnaire, sixteen performance measures were identified. Moreover, the analytic network process (ANP) was used to determine the weights of the selected indicators. Results indicated that the most important BSC aspects were Internal Process (0.3149), Customer (0.2769), Learning and Growth (0.2049) and Financial (0.2033), respectively. The proposed BSC framework can help universities to enhance their efficiency in a competitive environment.
Keywords: balanced scorecard, higher education, fuzzy Delphi method, analytic network process (ANP)
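The weighting step can be illustrated with the eigenvector calculation that AHP and ANP share: the priority weights are the normalised principal eigenvector of a pairwise comparison matrix. The judgments below are hypothetical, and a full ANP would additionally form and limit a supermatrix of interdependencies; this only sketches where weights like 0.31/0.28/0.20/0.20 come from.

```python
# Principal-eigenvector priority weights from a reciprocal pairwise matrix.
import numpy as np

perspectives = ["Internal Process", "Customer", "Learning and Growth", "Financial"]
# A[i, j] = judged importance of perspective i relative to j (hypothetical)
A = np.array([
    [1.0,   1.5,   2.0, 2.0],
    [1/1.5, 1.0,   1.5, 1.5],
    [1/2.0, 1/1.5, 1.0, 1.0],
    [1/2.0, 1/1.5, 1.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, principal].real)
w /= w.sum()                     # normalise to priority weights

for name, weight in sorted(zip(perspectives, w), key=lambda p: -p[1]):
    print(f"{name}: {weight:.4f}")
```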
Procedia PDF Downloads 428
11687 Risk Assessment of Building Information Modelling Adoption in Construction Projects
Authors: Amirhossein Karamoozian, Desheng Wu, Behzad Abbasnejad
Abstract:
Building information modelling (BIM) is a new technology for enhancing the efficiency of project management in the construction industry. In addition to the potential benefits of this useful technology, there are various risks and obstacles to applying it in construction projects. In this study, a decision-making approach is presented for risk assessment of BIM adoption in construction projects. Various risk factors of applying BIM during different phases of the project lifecycle are identified with the help of the Delphi method, experts' opinions and the related literature. Afterward, Shannon's entropy and fuzzy TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) are applied to derive priorities for the identified risk factors. Results indicated that lack of knowledge among professional engineers about BIM workflows and conflicts of opinion among different stakeholders are the risk factors with the highest priority.
Keywords: risk, BIM, fuzzy TOPSIS, construction projects
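A compact sketch of the prioritisation pipeline: Shannon entropy yields objective criteria weights, then TOPSIS ranks the risk factors by closeness to the ideal solution. The scores below are hypothetical expert ratings, and this sketch uses crisp TOPSIS for brevity; the paper applies the fuzzy extension over fuzzy ratings.

```python
# Entropy-weighted TOPSIS ranking of hypothetical BIM risk factors.
import numpy as np

risks = ["lack of BIM workflow knowledge", "stakeholder conflicts",
         "software cost", "legal uncertainty"]
# rows: risk factors, columns: criteria (e.g., probability, impact, detectability)
X = np.array([[8.0, 9.0, 6.0],
              [7.0, 8.0, 7.0],
              [5.0, 4.0, 8.0],
              [6.0, 6.0, 5.0]])

# Shannon entropy weights: low-entropy (discriminating) criteria weigh more
P = X / X.sum(axis=0)
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
w = (1 - E) / (1 - E).sum()

# TOPSIS on the weighted, vector-normalised matrix (all criteria benefit-type here)
V = w * X / np.sqrt((X ** 2).sum(axis=0))
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
closeness = d_minus / (d_plus + d_minus)

for name, c in sorted(zip(risks, closeness), key=lambda r: -r[1]):
    print(f"{name}: closeness = {c:.3f}")
```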
Procedia PDF Downloads 231
11686 The Task-Centered Instructional Strategy to Prepare Teachers for Integrating Robotics Activities in Science Education
Authors: Doaa Saad, Igor Verner, Rinat B. Rosenberg-Kima
Abstract:
This case study demonstrates how the Task-Centered Instructional Strategy can be used to develop robotics competencies in middle-school science teachers without programming knowledge, thereby reducing their anxiety about robotics. Sixteen middle school science teachers participated in a teachers' professional development program. The strategy combines a progression of real-world tasks with explicit instruction, which serves as the backbone of the instruction. The designed progression includes three tasks that integrate building and programming robots, pedagogy, and science knowledge, with an increasing level of complexity and a decreasing level of support. We used LEGO EV3 kits and programming blocks, a new technology for most of the participating teachers. Pre- and post-program questionnaires were used to examine teachers' anxiety about performing robotics tasks before the program began and after it ended. In addition, post-program questionnaires were used to obtain teachers' feedback on the program's overall quality. The case study results showed that teachers were less anxious about performing robotics tasks after the program and were highly satisfied with the professional development program. Overall, our research findings indicate a positive effect of the Task-Centered Instructional Strategy in preparing in-service science teachers to integrate robotics activities into their science classes.
Keywords: competencies, educational robotics, task-centered instructional strategy, teachers' professional development
Procedia PDF Downloads 87
11685 Predicting Trapezoidal Weir Discharge Coefficient Using Evolutionary Algorithm
Authors: K. Roushanger, A. Soleymanzadeh
Abstract:
Weirs are structures often used in irrigation techniques, sewer networks and flood protection. However, the hydraulic behavior of such weirs is complex and difficult to predict accurately. An accurate flow prediction over a weir mainly depends on a proper estimation of the discharge coefficient. In this study, the gene expression programming (GEP) approach was used to predict the discharge coefficient of trapezoidal and rectangular sharp-crested side weirs. Three different performance indexes were used as comparison criteria for evaluating the models' performance. The obtained results confirmed the capability of GEP in predicting the discharge coefficient of trapezoidal and rectangular side weirs. The results also revealed the influence of the downstream Froude number for the trapezoidal weir, and of the upstream Froude number for the rectangular weir, in predicting the discharge coefficient of both side weirs.
Keywords: discharge coefficient, gene expression programming, trapezoidal weir
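To show where a predicted coefficient ends up, here is the classical head-discharge relation for a rectangular sharp-crested weir, Q = (2/3) Cd sqrt(2g) b h^(3/2), with a Cd in place. For the side weirs studied above the coefficient instead feeds De Marchi's side-weir analysis; the numbers below are hypothetical.

```python
# Discharge over a rectangular sharp-crested weir for a given Cd.
import math

def weir_discharge(cd, width_m, head_m, g=9.81):
    """Discharge (m^3/s): Q = (2/3) * Cd * sqrt(2g) * b * h^(3/2)."""
    return (2.0 / 3.0) * cd * math.sqrt(2.0 * g) * width_m * head_m ** 1.5

cd_predicted = 0.62        # e.g., the output of an evolved GEP expression
print(f"Q = {weir_discharge(cd_predicted, width_m=1.5, head_m=0.30):.3f} m^3/s")
```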
Procedia PDF Downloads 388
11684 Identification and Origins of Multiple Personality: A Criterion from Wiggins
Authors: Brittany L. Kang
Abstract:
One familiar theory of the origin of multiple personalities focuses on how symptoms of trauma or abuse are central causes, as seen in paradigmatic examples of the condition. The theory states that multiple personalities constitute a congenital condition, as babies all exhibit multiplicity, and that alters generally remain separated only due to trauma. In more typical cases, the alters converge and become a single identity; only in cases of trauma, according to this account, do the alters remain separated. This theory is misleading in many respects, the most prominent being that not all multiple personality patients are victims of child abuse or trauma, nor are all cases of multiple personality observed in early childhood. The use of this criterion also causes clinical problems, including an inability to identify multiple personalities through the variety of symptoms and traits seen across observed cases. These issues present a need to revise the currently applied criterion in order to separate out the notion of child abuse and to better understand the origins of multiple personality itself. Identifying multiplicity through the application of identity theories will improve the current criterion, offering a bridge between identifying existing cases and understanding their origins. We begin by applying arguments from Wiggins, who held that each personality within a multiple was not a whole individual, but rather a character that switches off with the others. Wiggins' theory is supported by observational evidence of how such characters are differentiated. Alters of older ages are seen to require different prescription lenses, in addition to having different handwriting. The alters may also display drastically varying styles of clothing, preferences in food, gender, sexuality, religious beliefs and more. The definitions of terms such as 'personality' or 'persons' also become more distinct, leading to a greater understanding of who exactly can be classified as a patient with multiple personalities. While a more common meaning of personality is a designation of specific characteristics that account for the entirety of a person, this paper argues from Wiggins' theory that each 'personality' is in fact only partial. Clarification of the concept in question will allow for more successful future clinical applications.
Keywords: identification, multiple personalities, origin, Wiggins' theory
Procedia PDF Downloads 242
11683 Optimal Tamping for Railway Tracks, Reducing Railway Maintenance Expenditures by the Use of Integer Programming
Authors: Rui Li, Min Wen, Kim Bang Salling
Abstract:
For modern railways, maintenance is critical for ensuring safety, train punctuality and overall capacity utilization. The cost of railway maintenance in Europe is high, on average between 30,000 and 100,000 Euros per kilometer per year. In order to reduce such maintenance expenditures, this paper presents a mixed 0-1 linear mathematical model designed to optimize predictive railway tamping activities for ballasted track over a planning horizon of three to four years. The objective function minimizes the actual costs of the tamping machine. The approach uses a simple dynamic model of the condition-based tamping process and a solution method for finding the optimal condition-based tamping schedule. Seven technical and practical aspects are taken into account in scheduling tamping: (1) track degradation of the standard deviation of the longitudinal level over time; (2) track geometrical alignment; (3) track quality thresholds based on the train speed limits; (4) the dependency of the track quality recovery on the track quality after the tamping operation; (5) tamping machine operation practices; (6) tamping budgets; and (7) differentiating the open track from the station sections. A Danish railway track of 42.6 km between Odense and Fredericia is applied in the proposed maintenance model for time periods of three and four years. The generated tamping schedule is reasonable and robust. Based on the results from the Danish railway corridor, the total costs can be reduced significantly (by 50%) compared with a previous model based on optimizing the number of tamping operations. Different maintenance strategies are discussed in the paper. The analysis of the model results also shows that a longer period of predictive tamping planning gives a more optimal scheduling of maintenance actions than continuous short-term preventive maintenance, namely yearly condition-based planning.
Keywords: integer programming, railway tamping, predictive maintenance model, preventive condition-based maintenance
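A toy version of such a mixed 0-1 model can be written down in a few lines. The sketch below, using the PuLP modeling library, schedules tamping so that a linearly degrading quality indicator stays below a threshold at minimum cost. The degradation rates, recovery gain, threshold and costs are hypothetical, and the paper's model additionally handles recovery dependency, alignment, budgets and station sections.

```python
# Toy condition-based tamping MILP: binary x[s][t] = tamp segment s in year t.
import pulp

T = 4                      # planning periods (years)
segments = ["S1", "S2", "S3"]
sigma0 = {"S1": 0.9, "S2": 1.2, "S3": 0.7}    # initial std. dev. of longitudinal level (mm)
rate   = {"S1": 0.30, "S2": 0.45, "S3": 0.25} # assumed yearly degradation (mm/year)
LIMIT = 1.8                # quality threshold tied to the line speed (mm)
RECOVERY = 1.0             # assumed quality gain per tamping (mm)
COST = 1.0                 # cost per tamping action

prob = pulp.LpProblem("tamping_schedule", pulp.LpMinimize)
x = pulp.LpVariable.dicts("tamp", (segments, range(T)), cat="Binary")

# Objective: minimise total tamping cost
prob += pulp.lpSum(COST * x[s][t] for s in segments for t in range(T))

# Keep the degradation indicator below the threshold at every period:
# sigma(s, t) = sigma0 + rate * (t + 1) - RECOVERY * (tampings done up to t)
for s in segments:
    for t in range(T):
        prob += (sigma0[s] + rate[s] * (t + 1)
                 - RECOVERY * pulp.lpSum(x[s][k] for k in range(t + 1))) <= LIMIT

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for s in segments:
    plan = [t + 1 for t in range(T) if x[s][t].value() == 1]
    print(s, "tamp in years:", plan)
```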
Procedia PDF Downloads 447
11682 Control of Base Isolated Benchmark using Combined Control Strategy with Fuzzy Algorithm Subjected to Near-Field Earthquakes
Authors: Hashem Shariatmadar, Mozhgansadat Momtazdargahi
Abstract:
The purpose of structural control against earthquakes is to dissipate the earthquake input energy to the structure and reduce the plastic deformation of structural members. There are different methods of structural control for reducing the structural response to earthquakes: active, semi-active, passive and hybrid. In this paper, two different combined control systems are used to control an eight-story isolated benchmark steel structure: the first comprises a base isolator and multiple tuned mass dampers (BI & MTMD), and the second combines a hybrid base isolator and multiple tuned mass dampers (HBI & MTMD). The active control force of the hybrid isolator is estimated by fuzzy logic algorithms. The influences of the combined systems on the responses of the benchmark structure under two near-field earthquakes (Newhall and El Centro) are evaluated by nonlinear dynamic time history analysis. Applications of combined control systems consisting of passive or active systems installed in parallel with base-isolation bearings are capable of significantly reducing the response quantities (relative and absolute displacements) of base-isolated structures. Therefore, in the design and control of irregular isolated structures using the proposed control systems, the structural demands (relative and absolute displacements, etc.) in each direction must be considered separately.
Keywords: base-isolated benchmark structure, multi-tuned mass dampers, hybrid isolators, near-field earthquake, fuzzy algorithm
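To make the fuzzy estimation step concrete, here is a minimal Mamdani-style rule base that maps normalised isolator displacement and velocity to a control force command. The membership functions, rules and scaling are hypothetical; the paper's controller and its tuning are not reproduced here.

```python
# Minimal Mamdani fuzzy inference: two inputs, singleton outputs,
# weighted-average defuzzification.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_force(disp, vel):
    # Fuzzify inputs on [-1, 1]: negative (N), zero (Z), positive (P)
    d = {"N": tri(disp, -1.5, -1, 0), "Z": tri(disp, -1, 0, 1), "P": tri(disp, 0, 1, 1.5)}
    v = {"N": tri(vel, -1.5, -1, 0), "Z": tri(vel, -1, 0, 1), "P": tri(vel, 0, 1, 1.5)}
    # Hypothetical rule base: push against the motion (output singletons on [-1, 1])
    rules = [
        (min(d["P"], v["P"]), -1.0),   # large positive response -> large negative force
        (min(d["P"], v["Z"]), -0.5),
        (min(d["Z"], v["Z"]),  0.0),
        (min(d["N"], v["Z"]),  0.5),
        (min(d["N"], v["N"]),  1.0),
    ]
    w = sum(r[0] for r in rules)
    # Centroid of the fired singletons
    return sum(r[0] * r[1] for r in rules) / w if w > 0 else 0.0

print(fuzzy_force(0.8, 0.6))   # motion in +direction -> negative control force
```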
Procedia PDF Downloads 305
11681 A Robust Software for Advanced Analysis of Space Steel Frames
Authors: Viet-Hung Truong, Seung-Eock Kim
Abstract:
This paper presents a robust software package for the practical advanced analysis of space steel framed structures. The pre- and post-processors of the presented software package are coded in C++, while the solver is written in FORTRAN. A user-friendly graphical interface is developed to facilitate the modeling process and the interpretation of results. The solver employs stability functions to capture second-order effects and thereby minimize modeling and computational time. Both plastic-hinge and fiber-hinge beam-column elements are available in the presented software. The generalized displacement control method is adopted to solve the nonlinear equilibrium equations.
Keywords: advanced analysis, beam-column, fiber-hinge, plastic hinge, steel frame
Procedia PDF Downloads 308
11680 Reduction of Multiple User Interference for Optical CDMA Systems Using Successive Interference Cancellation Scheme
Authors: Tawfig Eltaif, Hesham A. Bakarman, N. Alsowaidi, M. R. Mokhtar, Malek Harbawi
Abstract:
Commonly, a primary problem in optical code-division multiple access (OCDMA) systems is the multiple user interference (MUI) noise resulting from the overlap among users. In this article, we aim to mitigate this problem by studying an interference cancellation scheme called successive interference cancellation (SIC). This scheme is tested on two different detection schemes, spectral amplitude coding (SAC) and direct detection systems (DS), using partial modified prime (PMP) codes as the signature codes. It was found that the SIC scheme based on both the SAC and DS methods has the potential to suppress the intensity noise, that is to say, it can mitigate MUI noise. Furthermore, the SIC/DS scheme showed a much lower bit error rate (BER) than the SIC/SAC scheme for different magnitudes of effective power. Hence, many more users can be supported by the SIC/DS receiver system.
Keywords: optical code-division multiple access (OCDMA), successive interference cancellation (SIC), multiple user interference (MUI), spectral amplitude coding (SAC), partial modified prime code (PMP)
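The core SIC loop (detect the strongest user, regenerate its contribution, subtract it, repeat) can be sketched on a toy synchronous CDMA model. The random signatures and amplitudes below are assumptions; the paper works with PMP codes and optical detection rather than this baseband abstraction.

```python
# Successive interference cancellation on a toy synchronous CDMA uplink.
import numpy as np

rng = np.random.default_rng(0)
L, K = 16, 4                            # code length, number of users
codes = rng.choice([-1.0, 1.0], size=(K, L)) / np.sqrt(L)  # hypothetical signatures
amps = np.array([2.0, 1.5, 1.0, 0.5])   # received amplitudes (near-far profile)
bits = rng.choice([-1.0, 1.0], size=K)

r = (amps * bits) @ codes + 0.05 * rng.standard_normal(L)  # received chip vector

detected = {}
residual = r.copy()
for k in np.argsort(-amps):             # strongest user first
    soft = codes[k] @ residual          # matched-filter (correlator) output
    b_hat = np.sign(soft)
    detected[k] = b_hat
    residual -= amps[k] * b_hat * codes[k]  # cancel the detected user's signal

print("sent:    ", bits)
print("detected:", np.array([detected[k] for k in range(K)]))
```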
Procedia PDF Downloads 521
11679 Seismic Bearing Capacity Estimation of Shallow Foundations on Dense Sand Underlain by Loose Sand Strata by Using Finite Elements Limit Analysis
Authors: Pragyan Paramita Das, Vishwas N. Khatri
Abstract:
By using lower- and upper-bound finite element limit analysis in conjunction with second-order cone programming (SOCP), the effect of seismic forces on the bearing capacity of a surface strip footing resting on dense sand underlain by a loose sand deposit is explored. The soil is assumed to obey the Mohr-Coulomb yield criterion and an associated flow rule. The angle of internal friction (ϕ) of the top and bottom layers is varied from 42° to 44° and from 32° to 34°, respectively. The coefficient of seismic acceleration is varied from 0 to 0.3. The variation of bearing capacity with the thickness of the top layer is generated for various seismic acceleration coefficients. A comparison is made with available solutions from the literature wherever applicable.
Keywords: bearing capacity, conic programming, finite elements, seismic forces
Procedia PDF Downloads 170
11678 Variable Tree Structure QR Decomposition-M Algorithm (QRD-M) in Multiple Input Multiple Output-Orthogonal Frequency Division Multiplexing (MIMO-OFDM) Systems
Authors: Jae-Hyun Ro, Jong-Kwang Kim, Chang-Hee Kang, Hyoung-Kyu Song
Abstract:
In multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) systems, the QR decomposition-M algorithm (QRD-M) has suboptimal error performance. However, the QRD-M still has high complexity due to the many calculations at each layer of the tree structure. To reduce this complexity, the proposed QRD-M modifies the existing tree structure by eliminating unnecessary candidates at almost all layers. The elimination method discards the candidates whose accumulated squared Euclidean distances are larger than a calculated threshold. The simulation results show that the proposed QRD-M achieves the same bit error rate (BER) performance as the conventional QRD-M with lower complexity.
Keywords: complexity, MIMO-OFDM, QRD-M, squared Euclidean distance
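For reference, conventional QRD-M is a breadth-first tree search over the triangular factor R, keeping the M best partial candidates by accumulated squared Euclidean distance. The sketch below uses a real-valued BPSK model for brevity; the proposed variant would additionally discard candidates whose accumulated distance exceeds the calculated threshold.

```python
# Conventional QRD-M detection on a real-valued BPSK MIMO model.
import numpy as np

rng = np.random.default_rng(1)
N, M = 4, 2                       # antennas, surviving candidates per layer
symbols = np.array([-1.0, 1.0])   # BPSK alphabet

H = rng.standard_normal((N, N))
x = rng.choice(symbols, size=N)
y = H @ x + 0.1 * rng.standard_normal(N)

Q, R = np.linalg.qr(H)
z = Q.T @ y                        # rotated receive vector

# Each candidate is (accumulated metric, partial symbol list for layers i..N-1)
candidates = [(0.0, [])]
for i in range(N - 1, -1, -1):     # tree search from the last layer upward
    expanded = []
    for metric, part in candidates:
        for s in symbols:
            trial = [s] + part
            interference = sum(R[i, i + 1 + j] * trial[1 + j] for j in range(len(part)))
            d = metric + (z[i] - R[i, i] * s - interference) ** 2
            expanded.append((d, trial))
    expanded.sort(key=lambda c: c[0])
    candidates = expanded[:M]      # keep the M best branches

x_hat = np.array(candidates[0][1])
print("sent:    ", x)
print("detected:", x_hat)
```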
Procedia PDF Downloads 335
11677 Fast Short-Term Electrical Load Forecasting under High Meteorological Variability with a Multiple Equation Time Series Approach
Authors: Charline David, Alexandre Blondin Massé, Arnaud Zinflou
Abstract:
In 2016, Clements, Hurn, and Li proposed a multiple equation time series approach for short-term load forecasting, reporting an average mean absolute percentage error (MAPE) of 1.36% on an 11-year dataset for the Queensland region of Australia. We present an adaptation of their model to the electrical power load consumption of the whole province of Quebec, Canada. More precisely, we take into account two additional meteorological variables, cloudiness and wind speed, on top of temperature, as well as multiple meteorological measurements taken at different locations across the territory. We also consider other minor improvements. Our final model shows an average MAPE score of 1.79% over an 8-year dataset.
Keywords: short-term load forecasting, special days, time series, multiple equations, parallelization, clustering
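The essence of the multiple equation approach is to fit a separate equation for each period of the day rather than one global model. A minimal sketch with synthetic data and a quadratic temperature response; the actual model also includes lagged loads, calendar effects and several weather stations.

```python
# One regression per hour of the day, evaluated with MAPE.
import numpy as np

rng = np.random.default_rng(2)
days, hours = 200, 24
temp = 10 + 10 * rng.standard_normal((days, hours))
load = 500 - 8 * temp + 0.3 * temp**2 + 20 * rng.standard_normal((days, hours))

coefs = []
for h in range(hours):                      # one equation per hour of day
    X = np.column_stack([np.ones(days), temp[:, h], temp[:, h] ** 2])
    beta, *_ = np.linalg.lstsq(X, load[:, h], rcond=None)
    coefs.append(beta)

def mape(actual, pred):
    return 100 * np.mean(np.abs((actual - pred) / actual))

pred = np.column_stack([
    np.column_stack([np.ones(days), temp[:, h], temp[:, h] ** 2]) @ coefs[h]
    for h in range(hours)
])
print(f"in-sample MAPE: {mape(load, pred):.2f}%")
```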
Procedia PDF Downloads 104
11676 Preparation and Evaluation of Multiple Unit Tablets of Aceclofenac
Authors: Vipin Saini, Sunil Kamboj, Suman Bala, A. Pandurangan
Abstract:
The present research is aimed at the fabrication of a multiple-unit controlled-release tablet formulation of aceclofenac, employing acrylic polymers as the release-controlling excipients for the drug multi-particulates, with the objective of maintaining the same controlled-release characteristics as those prior to their compression into a tablet. Various manufacturers successfully manufacture and market aceclofenac controlled-release tablets by applying coating materials directly onto the tablet. The basic idea behind the development of such formulations was to employ aqueous acrylic polymer dispersions as an alternative to the existing approaches, wherein the forces of compression may distort the drug pellets but do not have adverse effects on the drug release properties. Thus, the study was undertaken to illustrate the manufacturing of a controlled-release aceclofenac multiple-unit tablet formulation.
Keywords: aceclofenac, multiple-unit tablets, acrylic polymers, controlled-release
Procedia PDF Downloads 442
11675 Review of Theories and Applications of Genetic Programming in Sediment Yield Modeling
Authors: Adesoji Tunbosun Jaiyeola, Josiah Adeyemo
Abstract:
Sediment yield can be considered to be the total sediment load that leaves a drainage basin. Knowledge of the quantity of sediment present in a river at a particular time can lead to better flood capacity in reservoirs and consequently help to control over-bank flooding. Furthermore, as sediment accumulates in a reservoir, the reservoir gradually loses its ability to store water for the purposes for which it was built. The development of hydrological models to forecast the quantity of sediment present in a reservoir helps planners and managers of water resources systems to understand the system better in terms of its problems and alternative ways to address them. The application of artificial intelligence models and techniques to such real-life situations has proven to be an effective approach to solving complex problems. This paper makes an extensive review of the literature relevant to the theories and applications of evolutionary algorithms, and most especially genetic programming. The successful applications of genetic programming as a soft computing technique are reviewed in sediment modelling and other branches of knowledge. Some fundamental issues, such as benchmarking, generalization ability, bloat and over-fitting, and other open issues relating to the working principles of GP that need to be addressed by the GP community, are also highlighted. This review aims to give GP theoreticians, researchers and the general GP community enough research direction and valuable guidance, and to keep all stakeholders abreast of the issues which need attention during the next decade for the advancement of GP.
Keywords: benchmark, bloat, generalization, genetic programming, over-fitting, sediment yield
Procedia PDF Downloads 448
11674 Risk Analysis of Leaks from a Subsea Oil Facility Based on Fuzzy Logic Techniques
Authors: Belén Vinaixa Kinnear, Arturo Hidalgo López, Bernardo Elembo Wilasi, Pablo Fernández Pérez, Cecilia Hernández Fuentealba
Abstract:
The expanded use of risk assessment in legislative and corporate decision-making has increased the role of expert judgement in providing data for safety-related decision-making. Expert judgements are required in most steps of risk assessment: hazard identification, risk estimation, risk evaluation, and the analysis of alternatives. This paper presents a fault tree analysis (FTA), a probabilistic failure analysis, applied to the leakage of oil in a subsea production system. In standard FTA, the failure probabilities of the items of a system are treated as exact values when evaluating the failure probability of the top event. However, there is a persistent insufficiency of data for estimating the failure probabilities of components in the drilling industry. Therefore, fuzzy set theory can be used as a solution to this issue. The aim of this paper is to examine the leaks from the Zafiro West subsea oil facility by using fuzzy fault tree analysis (FFTA). As a result, the research has made theoretical and practical contributions to maritime safety and environmental protection. The approach has also traditionally been an effective strategy for identifying hazards in nuclear installations and the power industry.
Keywords: expert judgment, probability assessment, fault tree analysis, risk analysis, oil pipelines, subsea production system, drilling, quantitative risk analysis, leakage failure, top event, off-shore industry
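In FFTA the crisp gate arithmetic is replaced by fuzzy arithmetic. A minimal sketch with triangular fuzzy numbers (lower, modal, upper probabilities); the basic events, their values and the tree structure below are hypothetical, not the Zafiro West model, and a real FFTA would first aggregate expert judgements into these triples.

```python
# Fuzzy fault tree gates over triangular fuzzy failure probabilities.
import numpy as np

def f_and(*events):
    """AND gate: all inputs must fail -> multiply bound-wise."""
    out = np.array([1.0, 1.0, 1.0])
    for e in events:
        out *= np.asarray(e)
    return out

def f_or(*events):
    """OR gate: any input fails -> 1 - prod(1 - p), bound-wise."""
    out = np.array([1.0, 1.0, 1.0])
    for e in events:
        out *= 1.0 - np.asarray(e)
    return 1.0 - out

# Hypothetical basic events: (lower, modal, upper) failure probabilities
seal_failure = (0.002, 0.005, 0.010)
corrosion    = (0.001, 0.003, 0.008)
valve_stuck  = (0.004, 0.007, 0.012)
sensor_miss  = (0.010, 0.020, 0.040)

# Top event "oil leak": (seal fails OR corrosion) AND (valve stuck OR sensor misses)
top = f_and(f_or(seal_failure, corrosion), f_or(valve_stuck, sensor_miss))
l, m, u = top
print(f"fuzzy top-event probability = ({l:.6f}, {m:.6f}, {u:.6f})")
print(f"defuzzified (centroid of triangle): {(l + m + u) / 3:.6f}")
```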
Procedia PDF Downloads 192
11673 Developing a Spatial Decision Support System for Rationality Assessment of Land Use Planning Locations in Thai Binh Province, Vietnam
Authors: Xuan Linh Nguyen, Tien Yin Chou, Yao Min Fang, Feng Cheng Lin, Thanh Van Hoang, Yin Min Huang
Abstract:
In Vietnam, land use planning is the most important and powerful tool of the government for sustainable land use and land management. Nevertheless, many land use planning locations face protests from surrounding households due to environmental impacts. In addition, locations are planned entirely on the basis of the subjective decisions of planners, who are unsupported by tools or scientific methods. Hence, this research aims to assist decision-makers in evaluating the rationality of planning locations by developing a Spatial Decision Support System (SDSS) using Geographic Information System (GIS) technology, the Analytic Hierarchy Process (AHP) multi-criteria technique and fuzzy set theory. An ArcGIS Desktop add-in named SDSS-LUPA was developed to support users in analyzing data and presenting results in a friendly format. The fuzzy-AHP method is utilized as the analytic model for this SDSS. Eighteen planned locations in Hung Ha district (Thai Binh province, Vietnam) serve as a case study. The experimental results indicated that, against an assessment threshold of 0.65, the 18 planned locations were irrational because they are close to residential areas or to water sources. Some potential sites were also proposed to the authorities for consideration of land use planning changes.
Keywords: analytic hierarchy process, fuzzy set theory, land use planning, spatial decision support system
Procedia PDF Downloads 381
11672 Easymodel: Web-based Bioinformatics Software for Protein Modeling Based on Modeller
Authors: Alireza Dantism
Abstract:
Presently, describing the function of a protein sequence is one of the most common problems in biology. Usually, this problem can be approached by studying the three-dimensional structure of the protein. In the absence of an experimental protein structure, comparative modeling often provides a useful three-dimensional model of the protein that is based on at least one known protein structure. Comparative modeling predicts the three-dimensional structure of a given protein sequence (the target) mainly based on its alignment with one or more proteins of known structure (the templates). Comparative modeling consists of five main steps: (1) identification of similarity between the target sequence and at least one known template structure; (2) alignment of the target sequence and the template(s); (3) building a model based on the alignment with the selected template(s); (4) prediction of model errors; and (5) optimization of the built model. There are many computer programs and web servers that automate the comparative modeling process. One of the most important advantages of these servers is that they make comparative modeling available to both experts and non-experts, who can easily do their own modeling without the need for programming knowledge; some experts, however, prefer to use programming knowledge and do their modeling manually, because by doing so they can maximize the accuracy of their modeling. In this study, a web-based tool called EasyModel has been designed, using the PHP and Python programming languages, to predict the tertiary structure of proteins. Based on the user's inputs, EasyModel receives the unknown sequence of interest (the target) and a protein structure file (the template) that shares a percentage of similarity with the target sequence, predicts the tertiary structure of the unknown sequence, and presents the results in the form of graphs and constructed protein files.
Keywords: structural bioinformatics, protein tertiary structure prediction, modeling, comparative modeling, modeller
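Since the tool wraps Modeller, the underlying modeling run looks roughly like the classic basic Modeller script below. The file name and the alignment codes are hypothetical placeholders, and the alignment file must pair the target with the template in PIR format; this is a sketch of typical Modeller usage, not EasyModel's actual code.

```python
# Minimal comparative-modeling run in the style of a basic Modeller script.
# 'alignment.ali', 'template_pdb' and 'target_seq' are hypothetical placeholders.
from modeller import *
from modeller.automodel import *

env = environ()
env.io.atom_files_directory = ['.']      # where the template PDB file lives

a = automodel(env,
              alnfile='alignment.ali',   # target-template alignment (PIR)
              knowns='template_pdb',     # template code in the alignment
              sequence='target_seq')     # target code in the alignment
a.starting_model = 1
a.ending_model = 5                       # build five candidate models
a.make()

# Keep the candidate with the best (lowest) objective function value
ok = [m for m in a.outputs if m['failure'] is None]
best = min(ok, key=lambda m: m['molpdf'])
print('best model:', best['name'], 'molpdf:', best['molpdf'])
```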
Procedia PDF Downloads 98
11671 Objective vs. Perceived Quality in the Cereal Industry
Authors: Albena Ivanova, Jill Kurp, Austin Hampe
Abstract:
Cereal products in the US carry rich information on the front of the package (FOP) as well as point-of-purchase (POP) summaries provided by the store. These summaries are frequently confusing and misleading to the consumer. This study explores the relationship between perceived quality, objective quality, price, and value in the cold cereal industry. A total of 270 cold cereal products were analyzed, and the price, quality and value for different summaries were compared using ANOVA tests. The results provide evidence that the United States Department of Agriculture Organic FOP/POP is related to higher objective quality and a higher price, but not to a higher value. Whole-grain FOP/POP is related to higher objective quality, a lower or similar price, and higher value. Heart-healthy POP is related to higher objective quality, a similar price, and higher value. Gluten-free FOP/POP is related to lower objective quality, a higher price, and lower value. Kids' cereals were of lower objective quality, the same price, and lower value compared with those for the family and adult markets. The findings point to a disturbing tendency of companies to continue to produce lower-quality products for the kids' market while pricing them the same as high-quality products. The paper outlines strategies that marketers and policymakers can utilize to contribute to increased objective quality and value of breakfast cereal products in the United States.
Keywords: cereals, certifications, front-of-package claims, consumer health
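The comparison step is a standard one-way ANOVA across claim groups. A minimal sketch with synthetic quality scores standing in for the 270-product dataset:

```python
# One-way ANOVA: does objective quality differ across package-claim groups?
from scipy import stats
import numpy as np

rng = np.random.default_rng(3)
organic     = rng.normal(7.0, 1.0, 40)   # hypothetical quality scores per group
whole_grain = rng.normal(6.8, 1.0, 90)
gluten_free = rng.normal(5.5, 1.0, 30)
no_claim    = rng.normal(6.0, 1.0, 110)

f_stat, p_value = stats.f_oneway(organic, whole_grain, gluten_free, no_claim)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p -> group means differ
```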
Procedia PDF Downloads 125
11670 Multi-Objective Variable Neighborhood Search Algorithm to Solving Scheduling Problem with Transportation Times
Authors: Majid Khalili
Abstract:
This paper deals with a bi-objective hybrid no-wait flowshop scheduling problem minimizing the makespan and the total weighted tardiness, in which we consider transportation times between stages. Obtaining an optimal solution for this type of complex, large-sized problem in reasonable computational time by using traditional approaches and optimization tools is extremely difficult. This paper presents a new multi-objective variable neighborhood search algorithm (MOVNS). A set of experimental instances is used to evaluate the algorithm against available algorithms by means of advanced multi-objective performance measures and statistical tools. The results show that a variant of the proposed MOVNS provides sound performance compared with the other algorithms.
Procedia PDF Downloads 418
11669 Faults Diagnosis by Thresholding and Decision Tree with Neuro-Fuzzy System
Authors: Y. Kourd, D. Lefebvre
Abstract:
The monitoring of industrial processes is required to ensure the operating conditions of industrial systems through the automatic detection and isolation of faults. This paper proposes a fault diagnosis method based on a neuro-fuzzy hybrid structure. This hybrid structure combines threshold selection with a decision tree. The validation of the method is obtained with the DAMADICS benchmark. In the first phase of the method, a model representing the normal state of the system is constructed for fault detection. Fault signatures are obtained through residual analysis and the selection of appropriate thresholds. These signatures give rise to groups of non-separable faults. In the second phase, we build faulty models for the faults that cannot be isolated in the first phase. In the last phase, we construct the tree that isolates these faults.
Keywords: decision tree, residual analysis, ANFIS, fault diagnosis
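The detection phase can be pictured as thresholded residuals matched against a signature table. The thresholds, signatures and fault labels in this sketch are hypothetical, not the DAMADICS values:

```python
# Residual thresholding and signature matching for fault detection.
import numpy as np

thresholds = np.array([0.5, 0.3, 0.8])          # one per residual, from normal-state data
signature_table = {                              # hypothetical fault signatures
    (0, 0, 0): "no fault",
    (1, 0, 0): "actuator fault",
    (0, 1, 1): "sensor fault",
    (1, 1, 1): "process fault",
}

def diagnose(measured, predicted):
    residuals = np.abs(measured - predicted)     # deviation from the normal model
    signature = tuple((residuals > thresholds).astype(int))
    return signature_table.get(signature, "non-separable group (needs decision tree)")

print(diagnose(np.array([1.2, 0.1, 0.2]), np.array([0.5, 0.0, 0.1])))  # -> actuator fault
```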
Procedia PDF Downloads 628
11668 Deep Reinforcement Learning-Based Computation Offloading for 5G Vehicle-Aware Multi-Access Edge Computing Network
Authors: Ziying Wu, Danfeng Yan
Abstract:
Multi-Access Edge Computing (MEC) is one of the key technologies of the future 5G network. By deploying edge computing centers at the edge of the wireless access network, computation tasks can be offloaded to edge servers rather than to the remote cloud server, meeting the requirements of 5G low-latency and high-reliability application scenarios. Meanwhile, with the development of IoV (Internet of Vehicles) technology, various delay-sensitive and compute-intensive in-vehicle applications continue to appear. Compared with traditional internet services, these computation tasks have a higher processing priority and stricter delay requirements. In this paper, we design a 5G-based Vehicle-Aware Multi-Access Edge Computing Network (VAMECN) and pose a joint optimization problem of minimizing the total system cost. In view of this problem, a deep reinforcement learning-based joint computation offloading and task migration optimization (JCOTM) algorithm is proposed, considering the influence of multiple factors such as concurrent computation tasks, the distribution of system computing resources, and network communication bandwidth. The mixed-integer nonlinear programming problem is described as a Markov decision process. Experiments show that our proposed algorithm can effectively reduce task processing delay and equipment energy consumption, optimize the computation offloading and resource allocation schemes, and improve system resource utilization, compared with other computation offloading policies.
Keywords: multi-access edge computing, computation offloading, 5th generation, vehicle-aware, deep reinforcement learning, deep Q-network
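The deep Q-network machinery behind such an algorithm can be sketched on a toy offloading decision: given a state (task size, channel quality, server load), choose local execution or one of two edge servers. Network sizes, state features and rewards below are hypothetical; JCOTM's full state and action spaces are richer.

```python
# One DQN update step on a synthetic replay-buffer batch (PyTorch).
import torch
import torch.nn as nn

state_dim, n_actions = 3, 3        # [task size, channel gain, server load] -> {local, edge1, edge2}
q_net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
target_net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
target_net.load_state_dict(q_net.state_dict())
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
gamma = 0.99

# One batch sampled from a replay buffer (synthetic here)
states = torch.rand(32, state_dim)
actions = torch.randint(0, n_actions, (32,))
rewards = -torch.rand(32)          # negative cost: delay + energy consumption
next_states = torch.rand(32, state_dim)

q_values = q_net(states).gather(1, actions.unsqueeze(1)).squeeze(1)
with torch.no_grad():              # bootstrap target from the frozen network
    targets = rewards + gamma * target_net(next_states).max(dim=1).values

loss = nn.functional.mse_loss(q_values, targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"TD loss: {loss.item():.4f}")
```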
Procedia PDF Downloads 120
11667 A Linear Regression Model for Estimating Anxiety Index Using Wide Area Frontal Lobe Brain Blood Volume
Authors: Takashi Kaburagi, Masashi Takenaka, Yosuke Kurihara, Takashi Matsumoto
Abstract:
Major depressive disorder (MDD) is one of the most common mental illnesses today. It is believed to be caused by a combination of several factors, including stress. Stress can be quantitatively evaluated using the State-Trait Anxiety Inventory (STAI), one of the best indices for evaluating anxiety. Although STAI scores are widely used in applications ranging from clinical diagnosis to basic research, the scores are calculated based on a self-reported questionnaire. An objective evaluation is required because the subject may intentionally change his or her answers if multiple tests are carried out. In this article, we present a modified index called the “multi-channel Laterality Index at Rest (mc-LIR)”, obtained by recording the brain activity from a wider area of the frontal lobe using multi-channel functional near-infrared spectroscopy (fNIRS). The presented index aims to measure multiple positions near Fpz, as defined by the international 10-20 positioning system. Using 24 subjects, the dependency of the mc-LIR on the number of measuring points used to calculate it, and its correlation coefficients with the STAI scores, are reported. Furthermore, a simple linear regression was performed to estimate the STAI scores from the mc-LIR. The cross-validation error is also reported. The experimental results show that using multiple positions near Fpz improves the correlation coefficients and the estimation compared with using only two positions.
Keywords: frontal lobe, functional near-infrared spectroscopy, state-trait anxiety inventory score, stress
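The regression-and-validation step is straightforward to reproduce. A sketch with synthetic stand-ins for the 24 subjects' mc-LIR values and STAI scores, using leave-one-out cross-validation:

```python
# Simple linear regression of STAI scores on an mc-LIR-like feature,
# with leave-one-out cross-validation error.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(4)
mc_lir = rng.uniform(-1, 1, size=(24, 1))             # index per subject (synthetic)
stai = 40 + 12 * mc_lir[:, 0] + rng.normal(0, 3, 24)  # synthetic STAI scores

model = LinearRegression()
pred = cross_val_predict(model, mc_lir, stai, cv=LeaveOneOut())
cv_error = np.mean(np.abs(pred - stai))

model.fit(mc_lir, stai)
r = np.corrcoef(model.predict(mc_lir), stai)[0, 1]
print(f"correlation R = {r:.3f}, LOO mean absolute error = {cv_error:.2f}")
```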
Procedia PDF Downloads 251
11666 Analysis and Design of Dual-Polarization Antennas for Wireless Communication Systems
Authors: Vladimir Veremey
Abstract:
The paper describes the design and simulation of dual-polarization antennas that use the resonance and radiating properties of the H00 mode of open metal waveguides. The proposed antennas are formed by two orthogonal slots in a finite conducting ground plane. The slots are backed by metal screens connected to the ground plane, forming open waveguides. It has been shown that the antenna designs can be efficiently used in mm-wave bands. The single-mode operational bandwidth of the antenna is higher than 10%. The antenna designs are very simple and low-cost. They allow flush installation and can be efficiently used in various communication and remote sensing devices on fast-moving carriers. Mutual coupling between antennas of the proposed design is very low. Thus, multiple-antenna structures based on the proposed design can be efficiently employed in multi-band and multiple-input-multiple-output (MIMO) systems.
Keywords: antenna, antenna arrays, multiple-input-multiple-output (MIMO), millimeter wave bands, slot antenna, flush installation, directivity, open waveguide, conformal antennas
Procedia PDF Downloads 170
11665 Ramification of Oil Prices on Renewable Energy Deployment
Authors: Osamah A. Alsayegh
Abstract:
This paper contributes to the literature by updating the analysis of the impact of the recent fall in oil prices on the renewable energy (RE) industry and deployment. The analysis uses daily data from January 2003 to March 2016 on oil prices and the Renewable Energy Industrial Index (RENIXX), which tracks the world's 30 largest publicly traded RE companies. RENIXX represents RE industries developing solar, wind, geothermal, bioenergy, hydropower and fuel cell technologies. This paper tests the hypothesis that high oil prices encourage the substitution of alternative energy sources for conventional energy sources. Furthermore, it discusses the behavior of the RENIXX with respect to government policies, a factor that investors should take into account. Moreover, the paper proposes a theoretical model that relates RE industry progress to oil prices and policies through a fuzzy logic system.
Keywords: fuzzy logic, investment, policy, stock exchange index
Procedia PDF Downloads 239
11664 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema
Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy
Abstract:
Programming requires years of training. With natural language and end-user development methods, programming could become available to everyone. It would enable end users to program their own devices and extend the functionality of existing systems without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an automatically inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data by using natural language. The ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group and select) using the system as well as in Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel (since they directly selected the cells using the mouse) than in the ISPM. Using natural language for end-user software engineering in this way could help overcome the present bottleneck of professional developers.
Keywords: natural language processing, natural language interfaces, human computer interaction, end user development, dialog systems, data recognition, spreadsheet
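A toy illustration of the idea: infer a header row as the table schema, then resolve a simple query against the inferred column names. The parsing here is deliberately naive and all names are hypothetical; the ISPM infers table areas with machine learning and handles far richer queries.

```python
# Naive schema inference plus natural-language-style queries over a table.
import pandas as pd

rows = [["Product", "Region", "Sales"],       # header row to be inferred
        ["Laptop", "North", 1200],
        ["Laptop", "South", 950],
        ["Phone", "North", 800]]

# Schema inference (simplified): take the first all-text row as the header
header = rows[0]
table = pd.DataFrame(rows[1:], columns=header)

def query(text: str):
    """Resolve queries like 'sum of Sales' or 'sort by Sales'."""
    words = text.split()
    column = next(w for w in words if w in table.columns)  # match a schema name
    if words[0] == "sum":
        return table[column].sum()
    if words[0] == "sort":
        return table.sort_values(column)
    raise ValueError("unsupported operation")

print(query("sum of Sales"))    # -> 2950
print(query("sort by Sales"))
```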
Procedia PDF Downloads 313
11663 Optimization of Economic Order Quantity of Multi-Item Inventory Control Problem through Nonlinear Programming Technique
Authors: Prabha Rohatgi
Abstract:
To obtain efficient control over the huge inventory of drugs in the pharmacy department of any hospital, the medicines are generally categorized first on the basis of their cost, 'ABC' (Always Better Control), and then on the basis of their criticality, 'VED' (Vital, Essential, Desirable), for prioritization. About one-third of the annual expenditure of a hospital is spent on medicines. To minimize the inventory investment, hospital management may like to keep the medicines inventory low, as medicines are perishable items. The main aim of every hospital is to provide better services to patients under limited resources. To achieve a satisfactory level of health care services for outpatients, a hospital has to keep an eye on the wastage of medicines, because the expiry of medicines causes a great loss of money from a budget that is limited and allocated for a particular period of time. The objective of this study is to identify the categories of medicines requiring intensive managerial control. In this paper, to minimize the total inventory cost and the cost associated with the wastage of money due to the expiry of medicines, an inventory control model is used as an estimation tool, and then a nonlinear programming technique is applied under a limited budget and a fixed number of orders to be placed in a limited time period. Numerical computations are given, showing that by using scientific methods in hospital services we can manage inventory more effectively under limited resources and provide better health care services. Secondary data were collected from a hospital to provide empirical evidence.
Keywords: ABC-VED inventory classification, multi item inventory problem, nonlinear programming technique, optimization of EOQ
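A compact sketch of the optimization idea: start from the classical per-item EOQ, Q* = sqrt(2DS/H), then re-optimize the order quantities under a budget cap with a nonlinear solver. All demands, costs and the budget below are hypothetical.

```python
# Multi-item EOQ under a budget constraint, solved with scipy (SLSQP).
import numpy as np
from scipy.optimize import minimize

D = np.array([1200.0, 800.0, 400.0])   # annual demand per drug (units)
S = np.array([50.0, 40.0, 60.0])       # ordering cost per order
H = np.array([2.0, 3.0, 5.0])          # holding cost per unit per year
C = np.array([10.0, 25.0, 80.0])       # unit purchase cost
BUDGET = 4000.0                        # cap on average investment sum(C * Q / 2)

def total_cost(Q):
    return np.sum(D / Q * S + Q / 2 * H)   # ordering + holding cost

unconstrained = np.sqrt(2 * D * S / H)     # classical EOQ as the starting point
res = minimize(total_cost, x0=unconstrained,
               constraints=[{"type": "ineq", "fun": lambda Q: BUDGET - np.sum(C * Q / 2)}],
               bounds=[(1.0, None)] * len(D))

print("unconstrained EOQ:", np.round(unconstrained, 1))
print("budget-feasible Q:", np.round(res.x, 1), "cost:", round(res.fun, 1))
```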
Procedia PDF Downloads 256