Search results for: multiple linear models

3545 Appraisal of Methods for Identifying, Mapping, and Modelling of Fluvial Erosion in a Mining Environment

Authors: F. F. Howard, I. Yakubu, C. B. Boye, J. S. Y. Kuma

Abstract:

Natural and human activities, such as mining operations, expose the natural soil to adverse environmental conditions, leading to contamination of soil, groundwater, and surface water, which has negative effects on humans, flora, and fauna. Bare or partly exposed soil is the most liable to fluvial erosion. This paper reviews the various methods used to identify, map, and model fluvial erosion in a mining environment, covering classical, Artificial Intelligence (AI), and GIS approaches. One of the many classical methods used to estimate river erosion is the Revised Universal Soil Loss Equation (RUSLE) model. The RUSLE model is easy to use, but its reliance on empirical relationships that may not always be applicable to specific circumstances or locations is a flaw. Other classical models for estimating fluvial erosion are the Soil and Water Assessment Tool (SWAT) and the Universal Soil Loss Equation (USLE). These models offer a more complete understanding of the underlying physical processes and cover a wider range of situations; although more difficult to use, their correctness depends on the availability and reliability of input data. AI can help deal with complex, multivariate problems, can predict soil loss with higher accuracy than traditional methods, and can also be used to build dedicated models for identifying degraded areas. AI techniques have therefore become popular as an alternative predictor for degraded environments. Building on these strengths, this research proposes a hybrid of classical, AI, and GIS methods for efficient and effective modelling of fluvial erosion.
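
As a rough, hedged illustration of the RUSLE estimate the abstract refers to (A = R · K · LS · C · P), a minimal sketch might look as follows; all factor values below are invented for the example and are not from the study.

```python
# Hypothetical illustration of the RUSLE soil-loss estimate; factor values are invented.
def rusle_soil_loss(R, K, LS, C, P):
    """Annual soil loss A (t/ha/yr) from the Revised Universal Soil Loss Equation."""
    return R * K * LS * C * P

# Example factors for a partly exposed mine site (illustrative values only)
A = rusle_soil_loss(R=420.0,   # rainfall-runoff erosivity (MJ.mm/(ha.h.yr))
                    K=0.030,   # soil erodibility (t.ha.h/(ha.MJ.mm))
                    LS=1.8,    # slope length-steepness factor (dimensionless)
                    C=0.45,    # cover-management factor (bare/partly exposed soil)
                    P=1.0)     # support-practice factor (no conservation practice)
print(f"Estimated soil loss: {A:.2f} t/ha/yr")
```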

Keywords: Fluvial erosion, classical methods, Artificial Intelligence, Geographic Information System.

3544 Convection through Light Weight Timber Constructions with Mineral Wool

Authors: J. Schmidt, O. Kornadt

Abstract:

The major part of lightweight timber constructions consists of insulation. Mineral wool is the most commonly used insulation due to its cost efficiency and easy handling. The fiber orientation and porosity of this insulation material enable flow-through; its air flow resistance is low. If leakage occurs in the insulated bay section, the convective flow may cause energy losses and infiltration of the exterior wall with moisture and particles. In particular, the infiltrated moisture may lead to thermal bridges and to the growth of health-endangering mould and mildew. In order to prevent this problem, different numerical calculation models have been developed, and all models developed so far have potential for improvement. The implementation of the flow-through properties of mineral wool insulation may help to improve the existing models. Assuming that the real pressure difference between interior and exterior surfaces is larger than the pressure difference prescribed in the standard test procedure for mineral wool, ISO 9053 / EN 29053, measurements were performed using the measurement setup for research on convective moisture transfer "MSRCMT". These measurements show that structural inhomogeneities of mineral wool affect the permeability only at higher pressure differences, as applied in MSRCMT. Additional microscopic investigations show that the location of a leak within the construction has a crucial influence on the air flow-through and the infiltration rate. The results clearly indicate that the empirical values for the acoustic resistance of mineral wool should not be used for the calculation of convective transfer mechanisms.
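
For orientation, the quantity the cited standard characterises can be computed from a steady-flow measurement. The sketch below is a minimal illustration of that calculation, assuming the usual definitions (specific airflow resistance = pressure drop over linear flow velocity, resistivity = specific resistance per unit thickness); the numbers are illustrative, not measured data from the study.

```python
# Minimal sketch of an airflow-resistivity calculation of the kind underlying
# ISO 9053 / EN 29053; all numerical values are illustrative.
def airflow_resistivity(delta_p, volume_flow, area, thickness):
    """Return airflow resistivity r (Pa.s/m^2) of a specimen.

    delta_p     : pressure difference across the specimen (Pa)
    volume_flow : volumetric air flow rate through the specimen (m^3/s)
    area        : specimen cross-sectional area (m^2)
    thickness   : specimen thickness in flow direction (m)
    """
    u = volume_flow / area          # linear airflow velocity (m/s)
    r_specific = delta_p / u        # specific airflow resistance (Pa.s/m)
    return r_specific / thickness   # airflow resistivity (Pa.s/m^2)

# Example: mineral wool slab, 100 mm thick, at a higher-than-standard pressure step
print(airflow_resistivity(delta_p=50.0, volume_flow=5e-4, area=0.01, thickness=0.1))
```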

Keywords: convection, convective transfer, infiltration, mineral wool, permeability, resistance, leakage

3543 Fighter Aircraft Selection Using Fuzzy Preference Optimization Programming (POP)

Authors: C. Ardil

Abstract:

The Turkish Air Force needs to acquire a sixth-generation fighter aircraft in order to maintain its air superiority and dominance against its rivals under the risks posed by global geopolitical opportunities and threats. Accordingly, five evaluation criteria were determined to evaluate the sixth-generation fighter aircraft alternatives and to select the best one. A new fuzzy preference optimization programming (POP) method is proposed to systematically select the best sixth-generation fighter aircraft in an uncertain environment. The POP technique considers both quantitative and qualitative evaluation criteria. To demonstrate the applicability and effectiveness of the proposed approach, it is applied to a multiple criteria decision-making problem for evaluating and selecting sixth-generation fighter aircraft. The results of the fuzzy POP method are compared with those of the fuzzy TOPSIS approach for validation. According to the comparative analysis, the fuzzy POP and fuzzy TOPSIS methods produce the same results, demonstrating the applicability of the fuzzy POP technique to the sixth-generation fighter selection problem.
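
The fuzzy POP formulation itself is specific to the paper, so the sketch below instead shows the classical (crisp) TOPSIS baseline used for the comparison: normalise the decision matrix, weight it, and rank alternatives by closeness to the ideal solution. The criteria values and weights are invented placeholders.

```python
import numpy as np

# Hedged stand-in: crisp TOPSIS ranking over invented data, not the paper's fuzzy POP.
def topsis(matrix, weights, benefit):
    X = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    norm = X / np.linalg.norm(X, axis=0)          # vector normalisation
    V = norm * w                                   # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)      # distance to ideal solution
    d_neg = np.linalg.norm(V - anti,  axis=1)      # distance to anti-ideal solution
    return d_neg / (d_pos + d_neg)                 # closeness coefficient per alternative

scores = topsis(matrix=[[8, 7, 9, 6, 5],           # three alternatives, five criteria
                        [7, 8, 8, 7, 6],
                        [9, 6, 7, 8, 7]],
                weights=[0.3, 0.2, 0.2, 0.15, 0.15],
                benefit=[True, True, True, True, True])
print("Ranking (best first):", np.argsort(-scores))
```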

Keywords: Fighter aircraft selection, sixth-generation fighter aircraft, fuzzy decision process, multiple criteria decision making, preference optimization programming, POP, TOPSIS, Kizilelma, MIUS, fuzzy set theory

3542 Automated Transformation of 3D Point Cloud to Building Information Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Petar Penchev

Abstract:

The digital era has revolutionized architectural practices, with Building Information Modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research presents a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data — a collection of data points in space, typically produced by 3D scanners — into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historical preservation.
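
One building block of such a pipeline is detecting planar elements (e.g. walls) in the point cloud before translating them into parametric BIM objects. The sketch below is a hedged illustration of that single step using a plain RANSAC plane fit on synthetic points; it is not the authors' algorithm, and thresholds and data are invented.

```python
import numpy as np

# Hedged sketch: RANSAC plane detection, one possible sub-step of point-cloud-to-BIM.
def ransac_plane(points, n_iter=500, dist_thresh=0.02, rng=np.random.default_rng(0)):
    best_inliers, best_plane = np.array([], dtype=int), None
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        if np.linalg.norm(normal) < 1e-9:
            continue                                   # degenerate (collinear) sample
        normal /= np.linalg.norm(normal)
        d = -normal @ sample[0]
        dist = np.abs(points @ normal + d)             # point-to-plane distances
        inliers = np.where(dist < dist_thresh)[0]
        if len(inliers) > len(best_inliers):
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane, best_inliers

# Synthetic wall-like cloud: a noisy vertical plane plus background clutter
rng = np.random.default_rng(1)
wall = np.column_stack([rng.normal(0, 0.005, 500),
                        rng.uniform(0, 5, 500), rng.uniform(0, 3, 500)])
clutter = rng.uniform(-1, 5, (200, 3))
plane, inliers = ransac_plane(np.vstack([wall, clutter]))
print("Plane normal:", plane[0], "inliers:", len(inliers))
```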

Keywords: Algorithmic modeling, Building Information Modeling, point cloud, reconstruction.

3541 A Study on the Secure ebXML Transaction Models

Authors: Dongkyoo Shin, Dongil Shin, Sukil Cha, Seyoung Kim

Abstract:

ebXML (Electronic Business using eXtensible Markup Language) is an e-business standard, sponsored by UN/CEFACT and OASIS, which enables enterprises to exchange business messages, conduct trading relationships, communicate data in common terms, and define and register business processes. While there is tremendous e-business value in ebXML, security remains an unsolved problem and one of the largest barriers to adoption. Recently emerging XML security technologies offer the extensibility and flexibility needed to implement security features such as encryption, digital signature, access control and authentication. In this paper, we propose ebXML business transaction models that allow trading partners to securely exchange XML-based business transactions by employing XML security technologies. We show how each XML security technology meets the ebXML standard by constructing test software and validating messages between the trading partners.

Keywords: Electronic commerce, e-business standard, ebXML, XML security, secure business transaction.

3540 Instructional Design Practitioners in Malaysia: Skills and Issues

Authors: Irfan N. Umar, Yong Su-Lyn

Abstract:

The purpose of this research is to determine the knowledge and skills possessed by instructional design (ID) practitioners in Malaysia. As ID is a relatively new field in the country and there seems to be an absence of studies on its community of practice, the main objective of this research is to discover the tasks and activities performed by ID practitioners in educational and corporate organizations, as suggested by the International Board of Standards for Training, Performance and Instruction. This includes finding out the ID models applied in the course of their work. This research also attempts to identify the barriers and issues that explain why some ID tasks and activities are rarely or never conducted. The methodology employed in this descriptive study was a survey questionnaire sent to 30 instructional designers nationwide. The results showed that the majority of the tasks and activities are carried out frequently enough, but omissions do occur for reasons such as the task being outside the job scope, the decision having already been made at a higher level, and a lack of knowledge and skills. Further investigations of a qualitative nature should be conducted to achieve a more in-depth understanding of ID practices in Malaysia.

Keywords: instructional design, ID competencies, ID models, IBSTPI

3539 Data Envelopment Analysis under Uncertainty and Risk

Authors: P. Beraldi, M. E. Bruni

Abstract:

Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision making units. Traditionally, it assumes that input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that output parameters are represented by discretely distributed random variables, and we propose two different models defined according to risk-neutral and risk-averse perspectives. The models have been validated on a real case study concerning the evaluation of the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique, providing the decision maker with useful insights depending on the degree of risk aversion.
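
The stochastic models themselves are not reproduced here; as a hedged point of reference, the sketch below solves the deterministic CCR (input-oriented, multiplier form) DEA programme that such stochastic variants extend, using invented input/output data for three decision making units.

```python
import numpy as np
from scipy.optimize import linprog

# Hedged baseline: deterministic CCR DEA in multiplier form, illustrative data only.
X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0]])   # inputs, one row per DMU
Y = np.array([[1.0],      [2.0],      [1.5]])        # outputs, one row per DMU

def ccr_efficiency(o):
    n_out, n_in = Y.shape[1], X.shape[1]
    c = np.concatenate([-Y[o], np.zeros(n_in)])       # maximise u.y_o  ->  minimise -u.y_o
    A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0 for every DMU j
    b_ub = np.zeros(len(X))
    A_eq = np.concatenate([np.zeros(n_out), X[o]]).reshape(1, -1)   # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n_out + n_in))
    return -res.fun                                   # efficiency score in (0, 1]

print([round(ccr_efficiency(o), 3) for o in range(len(X))])
```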

Keywords: DEA, Stochastic Programming, Ex-ante evaluation technique, Conditional Value at Risk.

3538 Multivariate Output-Associative RVM for Multi-Dimensional Affect Predictions

Authors: Achut Manandhar, Kenneth D. Morton, Peter A. Torrione, Leslie M. Collins

Abstract:

Current trends in affect recognition research are to consider continuous observations from spontaneous natural interactions using multiple feature modalities, to represent affect in terms of continuous dimensions, to incorporate spatio-temporal correlation among affect dimensions, and to provide fast affect predictions. These research efforts have been propelled by a growing push to develop affect recognition systems that can enable seamless real-time human-computer interaction in a wide variety of applications. Motivated by these desired attributes of an affect recognition system, this work proposes a multi-dimensional affect prediction approach that integrates the multivariate Relevance Vector Machine (MVRVM) with the recently developed Output-associative Relevance Vector Machine (OARVM). The resulting approach can provide fast continuous affect predictions by jointly modeling the multiple affect dimensions and their correlations. Experiments on the RECOLA database show that the proposed approach performs competitively with the OARVM while providing faster predictions during testing.

Keywords: Dimensional affect prediction, Output-associative RVM, Multivariate regression.

3537 Interference Management in Long Term Evolution-Advanced System

Authors: Selma Sbit, Mohamed Bechir Dadi, Belgacem Chibani Rhaimi

Abstract:

Incorporating the Home eNodeB (HeNB) in cellular networks, e.g. Long Term Evolution Advanced (LTE-A), is beneficial for extending coverage and enhancing capacity at low cost, especially in non-line-of-sight (NLOS) environments such as homes. A HeNB, or femtocell, is a small low-powered base station which provides radio coverage to mobile users in an indoor environment. This deployment results in a heterogeneous network where the available spectrum is shared between two layers, and a problem of Inter-Cell Interference (ICI) appears; this issue is the main challenge in LTE-A. To deal with it, various techniques based on frequency, time and power control have been proposed. This paper deals with the impact of carrier aggregation and higher-order MIMO (Multiple Input Multiple Output) schemes on LTE-Advanced performance. Simulation results show the advantages of these schemes on the system capacity (4×10⁹ b/s when bandwidth B = 100 MHz and MIMO 8x8 is applied for SINR = 30 dB), the maximum theoretical peak data rate (more than 4 Gbps for B = 100 MHz when MIMO 8x8 is used) and the spectral efficiency (15 b/s/Hz and 30 b/s/Hz when MIMO 4x4 and MIMO 8x8 are applied, respectively, for SINR = 30 dB).
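
As a back-of-the-envelope companion to these figures, the sketch below uses the common idealised approximation C = B · min(Nt, Nr) · log2(1 + SINR). It ignores coding and control overheads, so it gives an upper bound rather than reproducing the paper's simulation results.

```python
import math

# Idealised MIMO capacity estimate; an upper bound, not the paper's simulator output.
def mimo_capacity(bandwidth_hz, nt, nr, sinr_db):
    sinr = 10 ** (sinr_db / 10)
    streams = min(nt, nr)               # number of parallel spatial streams
    return bandwidth_hz * streams * math.log2(1 + sinr)

B = 100e6                               # 100 MHz obtained via carrier aggregation
for nt, nr in [(4, 4), (8, 8)]:
    c = mimo_capacity(B, nt, nr, sinr_db=30)
    print(f"MIMO {nt}x{nr}: {c/1e9:.2f} Gb/s, spectral efficiency {c/B:.1f} b/s/Hz")
```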

Keywords: LTE-Advanced, carrier aggregation, MIMO, capacity, peak data rate, spectral efficiency.

3536 WPRiMA Tool: Managing Risks in Web Projects

Authors: Thamer Al-Rousan, Shahida Sulaiman, Rosalina Abdul Salam

Abstract:

Risk management is an essential part of project management and plays a significant role in project success. Many failures associated with Web projects are the consequence of poor awareness of the risks involved and of the lack of process models that can serve as a guideline for the development of Web-based applications; contemporary process models have instead been devised for the development of conventional software. This paper introduces WPRiMA (Web Project Risk Management Assessment), a tool that implements RIAP, the risk identification architecture pattern model, which focuses on data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level for a given Web project, to generate recommendations that facilitate risk avoidance in a project, and to improve the prospects of early risk management.

Keywords: Architecture pattern model, risk factors, risk identification, web project, web project risk management assessment.

3535 Smart Power Scheduling to Reduce Peak Demand and Cost of Energy in Smart Grid

Authors: Hemant I. Joshi, Vivek J. Pandya

Abstract:

This paper discusses simulation and experimental work on a small Smart Grid containing ten consumers. A Smart Grid is characterized by a two-way flow of real-time information and energy. A Real Time Pricing (RTP) based tariff is implemented in this work to reduce peak demand, the peak-to-average ratio (PAR) and the cost of energy consumed. In the experimental work described here, the working of the Smart Plug, the Home Energy Controller (HEC), the Home Area Network (HAN) and the communication link between consumers and the utility server are explained. Algorithms for the Smart Plug, the HEC, and the utility server are presented and explained. After receiving the real-time price for the different time slots of the day, the HEC reacts automatically by running an algorithm based on a Linear Programming Problem (LPP) formulation to find the optimal energy consumption schedule. The algorithm for the utility server can handle more than one off-peak period during the day. Simulation and experimental work are carried out for different cases, and a comparison between simulation and experimental results is presented to show the effectiveness of the minimization method adopted.
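
The sketch below is a hedged illustration of the kind of linear programme such a controller might solve after receiving real-time prices: spread a deferrable appliance's energy requirement over the cheapest slots subject to a per-slot power limit. Prices, the energy requirement and the power limit are invented values, not the paper's data.

```python
import numpy as np
from scipy.optimize import linprog

# Hedged LP sketch of RTP-driven scheduling; all numbers are illustrative placeholders.
prices = np.array([4, 4, 3, 3, 3, 4, 6, 8, 9, 9, 8, 7,       # cents/kWh per hourly slot
                   7, 7, 8, 9, 10, 12, 12, 10, 8, 6, 5, 4], dtype=float)
energy_required = 10.0        # kWh the appliance must consume during the day
max_per_slot = 1.5            # kWh limit per slot (smart-plug circuit constraint)

res = linprog(c=prices,                                  # minimise total energy cost
              A_eq=np.ones((1, 24)), b_eq=[energy_required],
              bounds=[(0, max_per_slot)] * 24)
schedule = res.x
print("Cost (cents):", round(prices @ schedule, 2))
print("Energy placed in cheapest slots:", np.nonzero(schedule > 1e-6)[0])
```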

Keywords: Smart Grid, Real Time Pricing, Peak to Average Ratio, Home Area Network, Home Energy Controller, Smart Plug, Utility Server, Linear Programming Problem.

3534 Identification of the Key Sustainability Issues to Develop New Decision Support Tools in the Spanish Furniture Sector

Authors: P. Cordero, R. Poler, R. Sanchis

Abstract:

The environmental impacts caused by current production and consumption models, together with the effects of the current economic crisis, make changes necessary in European industry toward new business models based on sustainability, which could allow companies to innovate and improve their competitiveness. This paper analyzes the key environmental issues and the current and future market trends in one of the most important industrial sectors in Spain, the furniture sector. It also proposes new decision support tools (a diagnostic kit, a roadmap and guidelines) to guide companies in implementing sustainability criteria in their organizations, including eco-design strategies and other economic and social strategies in accordance with the definition of sustainability, as well as other available tools such as eco-labels and environmental management systems, and in using and combining them to obtain the results the company expects, helping to improve its competitiveness.

Keywords: Furniture sector, eco-design, sustainability, economic crisis, market trends, roadmap

3533 Performance Based Design of Masonry Infilled Reinforced Concrete Frames for Near-Field Earthquakes Using Energy Methods

Authors: Alok Madan, Arshad K. Hashmi

Abstract:

Performance based design (PBD) is an iterative exercise in which a preliminary trial design of the building structure is selected; if the selected trial design does not conform to the desired performance objective, the trial design is revised. In this context, developing a fundamental approach for performance based seismic design of masonry infilled frames with a minimum number of trials is an important objective. The paper presents a plastic design procedure based on the energy balance concept for PBD of multi-story, multi-bay masonry infilled reinforced concrete (R/C) frames subjected to near-field earthquakes. The proposed energy based plastic design procedure was implemented for trial performance based seismic design of representative masonry infilled reinforced concrete frames with various practically relevant distributions of masonry infill panels over the frame elevation. Non-linear dynamic analyses of the trial PBD of masonry infilled R/C frames were performed under the action of near-field earthquake ground motions. The results of the non-linear dynamic analyses demonstrate that the proposed energy method is effective for performance based design of masonry infilled R/C frames under near-field as well as far-field earthquakes.

Keywords: Masonry Infilled Frame, Energy Methods, Near-fault Ground Motions, Pushover Analysis, Nonlinear Dynamic Analysis, Seismic Demand.

3532 Tagged Grid Matching Based Object Detection in Wavelet Neural Network

Authors: R. Arulmurugan, P. Sengottuvelan

Abstract:

Object detection using a Wavelet Neural Network (WNN) makes a major contribution to analysis in image processing. The existing cluster-based algorithm for co-saliency object detection operates on multiple images, but its co-saliency detection results are not well suited to handling multi-scale image objects in a WNN. The existing Super Resolution (SR) scheme for landmark images identifies the corresponding regions in the images and reduces the mismatching rate, but its structure-aware matching criterion does not address the detection of multiple regions in SR images and fails to improve the object detection rate. To detect objects in high-resolution remote sensing images, a Tagged Grid Matching (TGM) technique is proposed in this paper. The TGM technique consists of three main components: object determination, object searching and object verification in the WNN. First, object determination in the TGM technique specifies the position and size of objects in the current image; specifying the position and size using a hierarchical grid makes it easy to determine multiple objects. The second component, object searching, is carried out using cross-point searching; the cross-point of the objects is selected to speed up the searching process and reduce the detection time. The final component performs object verification, identifying (i.e., detecting) the dissimilarity of objects in the current frame. The verification process matches the search-result grid points with the stored grid points to detect the objects using the Gabor wavelet transform. The implementation of the TGM technique offers a significant improvement in the multi-object detection rate, processing time, precision factor and detection accuracy level.
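
The full TGM pipeline (hierarchical grids, cross-point search, grid verification) is specific to the paper. The hedged sketch below shows only one ingredient mentioned in the verification stage, Gabor-wavelet feature extraction, using OpenCV with arbitrarily chosen filter parameters and a random placeholder image.

```python
import cv2
import numpy as np

# Hedged sketch: Gabor-wavelet responses at four orientations; parameters are arbitrary.
def gabor_features(image, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    feats = []
    for theta in thetas:
        kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        response = cv2.filter2D(image, cv2.CV_32F, kernel)
        feats.append(response.mean())          # one simple statistic per orientation
    return np.array(feats)

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8).astype(np.float32)
print("Gabor response vector:", gabor_features(img))
```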

Keywords: Object Detection, Cross-point Searching, Wavelet Neural Network, Object Determination, Gabor Wavelet Transform, Tagged Grid Matching.

3531 Supplier Selection in a Scenario Based Stochastic Model with Uncertain Defectiveness and Delivery Lateness Rates

Authors: Abeer Amayri, Akif A. Bulgak

Abstract:

Due to today's globalization and the outsourcing practices of companies, Supply Chain (SC) performance has become more dependent on the efficient movement of material among geographically dispersed places, where there is more chance of disruption. One such source of disruption is the quality and delivery uncertainty of outsourcing. These uncertainties could lead to unsafe products and, as a number of recent examples show, companies may end up having to recall their products. As a result of these problems, there is a need to develop a methodology for selecting suppliers globally in view of the risks associated with low quality and late delivery. Accordingly, we developed a two-stage stochastic model that captures the risks associated with uncertainty in quality and delivery, as well as a solution procedure for the model. The stochastic model simultaneously optimizes supplier selection and purchase quantities under price discounts over a time horizon. In particular, our target is the study of global organizations with multiple sites and multiple overseas suppliers, where pricing is offered in the suppliers' local currencies. The proposed methodology is applied to a case study of a US automotive company with two assembly plants and four potential global suppliers to illustrate how the proposed model works in practice.

Keywords: Global supply chains, quality, stochastic programming, supplier selection.

3530 Modeling Biology Inspired Reactive Agents Using X-machines

Authors: George Eleftherakis, Petros Kefalas, Anna Sotiriadou, Evangelos Kehris

Abstract:

Recent advances in both the testing and verification of software based on formal specifications of the system to be built have reached a point where the ideas can be applied in a powerful way in the design of agent-based systems. The software engineering research has highlighted a number of important issues: the importance of the type of modeling technique used; the careful design of the model to enable powerful testing techniques to be used; the automated verification of the behavioural properties of the system; and the need to provide a mechanism for translating the formal models into executable software in a simple and transparent way. This paper introduces the use of the X-machine formalism as a tool for modeling biology inspired agents, proposing the use of the techniques built around X-machine models for the construction of effective and reliable agent-based software systems.
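
For readers unfamiliar with the formalism, the sketch below is a minimal stream X-machine in the spirit described above: transitions between states are labelled by processing functions that read an input together with a memory and produce an output and an updated memory. The tiny foraging-agent example (its states, functions and memory) is invented purely for illustration.

```python
# Minimal stream X-machine sketch; the foraging-agent example is a made-up illustration.
class XMachine:
    def __init__(self, initial_state, initial_memory, transitions):
        # transitions: {(state, function_label): (processing_function, next_state)}
        self.state, self.memory, self.transitions = initial_state, initial_memory, transitions

    def step(self, label, symbol):
        func, next_state = self.transitions[(self.state, label)]
        output, self.memory = func(self.memory, symbol)   # phi: (memory, input) -> (output, memory')
        self.state = next_state
        return output

# Processing functions phi
def pick_up(memory, food):   return f"picked {food}", memory + [food]
def drop(memory, _):         return f"dropped {memory[-1]}", memory[:-1]

agent = XMachine("searching", [], {
    ("searching", "pick_up"): (pick_up, "carrying"),
    ("carrying", "drop"):     (drop, "searching"),
})
print(agent.step("pick_up", "seed"), "->", agent.state, agent.memory)
print(agent.step("drop", None), "->", agent.state, agent.memory)
```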

Keywords: Biology inspired agent, formal methods, x-machines.

3529 Modelling and Analysis of a Robust Control of Manufacturing Systems: Flow-Quality Approach

Authors: Lotfi Nabli, Achraf Jabeur Telmoudi, Radhi M'hiri

Abstract:

This paper proposes a method for modeling the laws that control manufacturing systems with temporal and non-temporal constraints. A methodology for constructing robust control, generating the margins of passive and active robustness, is elaborated. Two principal models are presented in this paper. The first utilizes P-time Petri Nets to manage flow-type disturbances. The second, the quality model, exploits the Intervals Constrained Petri Nets (ICPN) tool, which allows the system to preserve its quality specificities. The redundancy between the passive and active robustness of the elementary parameters is also used. The final model allows the correlation of temporal and non-temporal criteria by putting the two principal models in interaction. To do so, a set of definitions and theorems is employed and confirmed by application examples.

Keywords: Manufacturing systems control, flow, quality, robustness, redundancy, Petri Nets.

3528 Data-Driven Decision-Making in Digital Entrepreneurship

Authors: Abeba Nigussie Turi, Xiangming Samuel Li

Abstract:

Data-driven business models are more typical for established businesses than for early-stage startups that strive to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. We develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers such as poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in that it encompasses startup data analytics enablers and metrics aligned with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship.

Keywords: Startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship.

3527 Optimal Location of Multi Type Facts Devices for Multiple Contingencies Using Particle Swarm Optimization

Authors: S. Sutha, N. Kamaraj

Abstract:

In a deregulated operating regime, power system security is an issue that needs due attention from researchers, given the unbundling of generation and transmission. Electric power systems are exposed to various contingencies, and network contingencies often contribute to overloading of branches and violation of voltages, leading to problems of security and stability. To maintain the security of the system, it is desirable to estimate the effect of contingencies so that pertinent control measures can be taken to improve system security. This paper presents the application of a particle swarm optimization algorithm to find the optimal locations of multi-type FACTS devices in a power system in order to eliminate or alleviate line overloads. The optimization is performed over the locations of the devices, their types, their settings and the installation cost of the FACTS devices, for single and multiple contingencies. TCSC, SVC and UPFC are considered and modeled for steady-state analysis. Suitable locations for the UPFC and TCSC are selected on the basis of improved system security. The effectiveness of the proposed method is tested on the IEEE 6-bus and IEEE 30-bus test systems.
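
The sketch below is a hedged, generic particle swarm optimisation loop of the kind such a search relies on; here it simply minimises a toy quadratic stand-in for a severity/performance index over continuous decision variables. The swarm parameters, bounds and objective are invented, and the real problem additionally involves discrete device types and power-flow evaluations.

```python
import numpy as np

# Generic PSO sketch minimising a toy objective; not the paper's FACTS placement model.
def pso(objective, dim, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5,
        rng=np.random.default_rng(0)):
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))          # particle positions
    v = np.zeros_like(x)                                 # particle velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, pbest_val.min()

# Toy stand-in for a severity/performance index to be minimised
best, val = pso(lambda p: np.sum((p - 0.3) ** 2), dim=3, bounds=(0.0, 1.0))
print("Best settings:", best.round(3), "objective:", round(val, 6))
```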

Keywords: Contingency Severity Index, Particle Swarm Optimization, Performance Index, Static Security Assessment.

3526 Software Maintenance Severity Prediction with Soft Computing Approach

Authors: E. Ardil, Erdem Uçar, Parvinder S. Sandhu

Abstract:

Since the majority of faults are found in a few modules of a system, there is a need to investigate the modules that are severely affected compared to other modules, and proper maintenance needs to be done on time, especially for critical applications. In this paper, we have explored different predictor models on NASA's public-domain defect dataset coded in the Perl programming language. Different machine learning algorithms belonging to different learner categories of the WEKA project, including a Mamdani-based fuzzy inference system and a neuro-fuzzy based system, have been evaluated for modeling maintenance severity, or the impact of fault severity. The results are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). The results show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models and hence can be used for maintenance severity prediction of the software.
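
For clarity on the two reported error measures, the short sketch below computes MAE and RMSE for a handful of made-up severity predictions; the numbers are not from the study.

```python
import numpy as np

# Quick illustration of MAE and RMSE on invented severity predictions.
def mae(y_true, y_pred):
    return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

actual    = [3, 1, 4, 2, 5, 1]                 # fault-severity labels of six modules
predicted = [2.6, 1.2, 3.8, 2.4, 4.1, 1.1]     # hypothetical model outputs
print(f"MAE = {mae(actual, predicted):.3f}, RMSE = {rmse(actual, predicted):.3f}")
```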

Keywords: Software Metrics, Fuzzy, Neuro-Fuzzy, Software Faults, Accuracy, MAE, RMSE.

3525 The Application of HLLC Numerical Solver to the Reduced Multiphase Model

Authors: Fatma Ghangir, Andrzej F. Nowakowski, Franck C. G. A. Nicolleau, Thomas M. Michelitsch

Abstract:

The performance of high-resolution schemes is investigated for unsteady, inviscid and compressible multiphase flows. An Eulerian diffuse interface approach has been chosen for the simulation of multicomponent flow problems. The reduced five-equation and seven-equation models are used with the HLL and HLLC approximations. The authors demonstrate the advantages and disadvantages of both the seven-equation and five-equation models by studying their performance with the HLL and HLLC algorithms on a simple test case. The seven-equation model is based on the two-pressure, two-velocity concept of Baer–Nunziato [10], while the five-equation model is based on the mixture velocity and pressure. The numerical evaluations of the two variants of Riemann solvers have been conducted for the classical one-dimensional air-water shock tube and compared with the analytical solution for error analysis.
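
As a hedged, much-reduced illustration of the flux approximation named above, the sketch below evaluates the two-wave HLL flux for the single-phase 1D Euler equations with simple Davis wave-speed estimates. The paper's reduced multiphase systems carry additional equations, and HLLC further restores the contact wave; this only shows the core HLL building block on Sod-type left/right states.

```python
import numpy as np

# Hedged single-phase HLL flux for 1D Euler; not the paper's multiphase solver.
GAMMA = 1.4

def euler_flux(U):
    rho, mom, E = U
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u * u)
    return np.array([mom, mom * u + p, (E + p) * u])

def hll_flux(UL, UR):
    rhoL, uL = UL[0], UL[1] / UL[0]
    rhoR, uR = UR[0], UR[1] / UR[0]
    pL = (GAMMA - 1.0) * (UL[2] - 0.5 * rhoL * uL * uL)
    pR = (GAMMA - 1.0) * (UR[2] - 0.5 * rhoR * uR * uR)
    aL, aR = np.sqrt(GAMMA * pL / rhoL), np.sqrt(GAMMA * pR / rhoR)
    SL, SR = min(uL - aL, uR - aR), max(uL + aL, uR + aR)   # Davis wave-speed estimates
    FL, FR = euler_flux(UL), euler_flux(UR)
    if SL >= 0.0:
        return FL
    if SR <= 0.0:
        return FR
    return (SR * FL - SL * FR + SL * SR * (UR - UL)) / (SR - SL)

# Sod-type left/right conserved states (rho, rho*u, E) of a classical shock tube
UL = np.array([1.0, 0.0, 1.0 / (GAMMA - 1.0)])
UR = np.array([0.125, 0.0, 0.1 / (GAMMA - 1.0)])
print("Interface HLL flux:", hll_flux(UL, UR))
```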

Keywords: Multiphase flow, gas-liquid flow, Godunov schemes, Riemann solvers, HLL scheme, HLLC scheme.

3524 M2LGP: Mining Multiple Level Gradual Patterns

Authors: Yogi Satrya Aryadinata, Anne Laurent, Michel Sala

Abstract:

Gradual patterns have been studied for many years as they contain precious information. They have been integrated in many expert systems and rule-based systems, for instance to reason on knowledge such as “the greater the number of turns, the greater the number of car crashes”. In many cases, this knowledge has been considered as a rule “the greater the number of turns → the greater the number of car crashes”. Historically, work has thus been focused on the representation of such rules, studying how implication could be defined, especially fuzzy implication. These rules were defined by experts who were in charge of describing the systems they were working on in order to make them operate automatically. More recently, approaches have been proposed for mining databases to automatically discover such knowledge. Several approaches have been studied, the main scientific topics being: how to determine what a relevant gradual pattern is, and how to discover such patterns as efficiently as possible (in terms of both memory and CPU usage). However, in some cases, end-users are not interested in raw low-level knowledge and are rather interested in trends. Moreover, it may be the case that no relevant pattern can be discovered at a low level of granularity (e.g. city), whereas some can be discovered at a higher level (e.g. county). In this paper, we thus extend gradual pattern approaches in order to consider multiple-level gradual patterns. For this purpose, we consider two aggregation policies, namely horizontal and vertical.
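
To make the notion concrete, the hedged sketch below computes one common pair-based support measure for a gradual pattern such as {turns increases, crashes increases}: the fraction of object pairs that can be ordered so that every attribute of the pattern varies in the stated direction. The definition chosen and the road-section data are illustrative, not taken from the paper.

```python
from itertools import combinations

# Hedged sketch of a pair-based support measure for gradual patterns; data are invented.
def gradual_support(records, pattern):
    # pattern: list of (attribute, direction), direction '+' (greater) or '-' (smaller)
    concordant, total = 0, 0
    for a, b in combinations(records, 2):
        total += 1
        # flip differences so every attribute of the pattern "points the same way"
        diffs = [(b[attr] - a[attr]) * (1 if direction == '+' else -1)
                 for attr, direction in pattern]
        # the pair supports the pattern if it can be ordered so all attributes increase
        if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
            concordant += 1
    return concordant / total if total else 0.0

roads = [{"turns": 2, "crashes": 1}, {"turns": 5, "crashes": 4},
         {"turns": 8, "crashes": 6}, {"turns": 3, "crashes": 2}]
print(gradual_support(roads, [("turns", '+'), ("crashes", '+')]))
```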

Keywords: Gradual Pattern.

3523 Application of Turbulence Modeling in Computational Fluid Dynamics for Airfoil Simulations

Authors: Mohammed Bilal

Abstract:

The precise prediction of aerodynamic behavior is necessary for the design and optimization of airfoils for a variety of applications. Turbulence, a phenomenon of complex and irregular flow, significantly affects the aerodynamic properties of airfoils. Therefore, turbulence modeling is essential for accurately predicting the behavior of airfoils in simulations. This study investigates five commonly employed turbulence models: Spalart-Allmaras (SA) model, k-epsilon model, k-omega model, Reynolds Stress Model (RSM), and Large Eddy Simulation (LES) model. The paper includes a comparison of the models' precision, computational expense, and applicability to various flow conditions. The strengths and weaknesses of each model are highlighted, allowing researchers and engineers to make informed decisions regarding simulations of specific airfoils. Unquestionably, the continuous development of turbulence modeling will contribute to further improvements in airfoil design and optimization, which will be advantageous to numerous industries.

Keywords: Computational fluid dynamics, airfoil, turbulence, aircraft.

3522 Analyzing the Impact of Spatio-Temporal Climate Variations on the Rice Crop Calendar in Pakistan

Authors: Muhammad Imran, Iqra Basit, Mobushir Riaz Khan, Sajid Rasheed Ahmad

Abstract:

The present study investigates the space-time impact of climate change on the rice crop calendar in tropical Gujranwala, Pakistan. The climate change impact was quantified through climatic variables, whereas the existing calendar of the rice crop was compared with the phenological stages of the crop, depicted through the time series of the Normalized Difference Vegetation Index (NDVI) derived from Landsat data for the decade 2005-2015. Local maxima detection was applied to the NDVI time series to compute the rice phenological stages. Panel models with fixed and cross-section fixed effects were used to establish the relation between the climatic parameters and the NDVI time series across villages and across rice growing periods. Results show that the climatic parameters have a significant impact on the rice crop calendar. Moreover, the fixed effect model is a significant improvement over the cross-section fixed effect model (R-squared equal to 0.673 vs. 0.0338). We conclude that high inter-annual variability of climatic variables causes high variability of NDVI and, thus, a shift in the rice crop calendar. Moreover, the inter-annual (temporal) variability of the rice crop calendar is high compared to the inter-village (spatial) variability. We suggest that local rice farmers adapt to this change in the rice crop calendar.
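
The hedged sketch below illustrates the two processing steps mentioned above: computing NDVI from red and near-infrared reflectance, and locating local maxima in the NDVI time series as a proxy for a phenological stage. The reflectance series is synthetic, not Landsat data from Gujranwala.

```python
import numpy as np
from scipy.signal import find_peaks

# Hedged sketch: NDVI computation and local-maxima detection on a synthetic season.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

dates = np.arange(0, 180, 16)                        # a 16-day revisit over one season
red = 0.20 - 0.10 * np.exp(-((dates - 90) / 40) ** 2)
nir = 0.30 + 0.35 * np.exp(-((dates - 90) / 40) ** 2)
series = ndvi(nir, red)

peaks, _ = find_peaks(series)                        # local maxima of the NDVI curve
print("NDVI series:", series.round(2))
print("Peak (phenological-stage proxy) at day:", dates[peaks])
```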

Keywords: Landsat NDVI, panel models, temperature, rainfall.

3521 Properties of Bacterial Nanocellulose for Scenic Arts

Authors: B. Suárez, G. Forman

Abstract:

Kombucha (a symbiotic culture of bacteria and yeast) produces a material capable of acquiring multiple shapes and textures that change significantly under different environmental or temperature conditions (e.g., when it is exposed to wet conditions), properties that may be explored in the scenic industry. This paper presents an analysis of its specific characteristics, exploring it as a non-conventional material for arts and performance. Costume design uses surfaces as a powerful means of expression to represent concepts and stories, and may apply the unique features of nano bacterial cellulose (NBC) as assets in this artistic context. A mix of qualitative and quantitative (interventionist) methodological approaches was used, including a review of relevant literature to deepen knowledge of the research topic (crossing bibliographies from different fields of study: biology, art, costume design, etc.), as well as descriptive methods: laboratory experiments, documentation of quantities, and observation to identify the material properties and possibilities used to express multiple narrative ideas, concepts and feelings. The results confirmed that NBC is an interactive and versatile material viable for use in an alternative scenic context; its unique aesthetic and performative qualities, which change in contact with moisture, are resources that can be used to create a visual and poetic impact on stage.

Keywords: Biotechnological materials, contemporary dance, costume design, nano bacterial cellulose, performing arts.

3520 Analytical Development of a Failure Limit and Iso-Uplift Curves for Eccentrically Loaded Shallow Foundations

Authors: N. Abbas, S. Lagomarsino, S. Cattari

Abstract:

Examining existing experimental results for shallow rigid foundations subjected to a vertical centric load (N), accompanied or not by a bending moment (M), two main non-linear mechanisms governing the cyclic response of the soil-foundation system can be distinguished: foundation uplift and soil yielding. A soil-foundation failure limit is defined as a domain of resistance in the two-dimensional (2D) load space (N, M) inside which lie all the admissible combinations of loads; the latter correspond to a purely elastic, non-linear elastic or plastic behavior of the soil-foundation system, while the points lying on the failure limit correspond to combinations of loads leading to failure of the soil-foundation system. In this study, the proposed resistance domain is constructed analytically on the basis of mechanics. Original elastic-limit, uplift-initiation and iso-uplift limits are constructed inside this domain. These limits give a prediction of the mechanisms activated for each combination of loads applied to the foundation. A comparison of the proposed failure limit with experimental tests existing in the literature shows interesting results. Also, the developed uplift-initiation limit and iso-uplift curves are compared with others already proposed in the literature and widely used due to the absence of alternatives; remarkable differences are noted, showing evident errors in the past proposals and good accuracy for those given in the present work.
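
As a hedged back-of-the-envelope companion to the uplift-initiation limit discussed above: for a rigid strip footing of width B on linear-elastic, no-tension soil, uplift classically initiates once the load eccentricity e = M/N leaves the middle third of the base (e > B/6). The paper's analytical construction is far more elaborate; the loads in the sketch below are purely illustrative.

```python
# Hedged middle-third check for a rigid strip footing; numbers are illustrative only.
def uplift_state(N, M, B):
    e = M / N                       # load eccentricity (m)
    if e <= B / 6:
        return e, "full contact (no uplift)"
    if e < B / 2:
        return e, "partial uplift of the base"
    return e, "resultant outside the base (overturning)"

for M in (60.0, 240.0, 480.0):       # kN.m, with N = 600 kN and B = 1.5 m
    e, state = uplift_state(N=600.0, M=M, B=1.5)
    print(f"M = {M:>5.0f} kN.m -> e = {e:.3f} m : {state}")
```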

Keywords: Foundation uplift, Iso-uplift curves, Resistance domain, Soil yield.

3519 Structural Parsing of Natural Language Text in Tamil Using Phrase Structure Hybrid Language Model

Authors: Selvam M., Natarajan A. M., Thangarajan R.

Abstract:

Parsing is important in linguistics and Natural Language Processing for understanding the syntax and semantics of a natural language grammar. Parsing natural language text is challenging because of problems like ambiguity and inefficiency, and the interpretation of natural language text depends on context-based techniques. A probabilistic component is essential to resolve ambiguity in both syntax and semantics, thereby increasing the accuracy and efficiency of the parser. The Tamil language has some inherent features which make it more challenging. In order to obtain solutions, a lexicalized and statistical approach is applied to parsing with the aid of a language model. Statistical models mainly focus on the semantics of the language and are suitable for large-vocabulary tasks, whereas structural methods focus on syntax and model small-vocabulary tasks. A trigram-based statistical language model for Tamil with a medium vocabulary of 5,000 words has been built. Though statistical parsing gives better performance through trigram probabilities and a large vocabulary size, it has some disadvantages, such as its focus on semantics rather than syntax and its lack of support for free word order and long-range relationships. To overcome these disadvantages, a structural component is to be incorporated into statistical language models, which leads to the implementation of hybrid language models. This paper attempts to build a phrase-structured hybrid language model which resolves the above-mentioned disadvantages. In developing the hybrid language model, a new part-of-speech tag set for the Tamil language has been created with more than 500 tags, providing wider coverage. A phrase-structured treebank has been developed with 326 Tamil sentences covering more than 5,000 words. A hybrid language model has been trained on the phrase-structured treebank using the immediate head parsing technique. A lexicalized and statistical parser which employs this hybrid language model and the immediate head parsing technique gives better results than pure grammar-based and trigram-based models.
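
The hedged sketch below shows only the purely statistical component mentioned above: a maximum-likelihood trigram language model estimated from tokenised sentences. The two-sentence English "corpus" is a placeholder standing in for the 5,000-word Tamil corpus, and no smoothing is applied.

```python
from collections import defaultdict

# Hedged sketch: maximum-likelihood trigram model over a tiny placeholder corpus.
def train_trigrams(sentences):
    tri, bi = defaultdict(int), defaultdict(int)
    for sent in sentences:
        tokens = ["<s>", "<s>"] + sent.split() + ["</s>"]
        for i in range(len(tokens) - 2):
            bi[(tokens[i], tokens[i + 1])] += 1
            tri[(tokens[i], tokens[i + 1], tokens[i + 2])] += 1
    # P(w3 | w1, w2) = count(w1 w2 w3) / count(w1 w2), with no smoothing
    return lambda w1, w2, w3: tri[(w1, w2, w3)] / bi[(w1, w2)] if bi[(w1, w2)] else 0.0

p = train_trigrams(["the parser reads the sentence",
                    "the parser builds the tree"])
print(p("<s>", "the", "parser"))     # 1.0 : both sentences start with "the parser"
print(p("the", "parser", "reads"))   # 0.5 : one of the two observed continuations
```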

Keywords: Hybrid Language Model, Immediate Head Parsing, Lexicalized and Statistical Parsing, Natural Language Processing, Parts of Speech, Probabilistic Context Free Grammar, Tamil Language, Tree Bank.

3518 Lane Changing and Merging Maneuvers of Carlike Robots

Authors: Bibhya Sharma, Jito Vanualailai, Ravindra Rai

Abstract:

This research paper designs a unique motion planner for multiple platoons of nonholonomic car-like robots as a feasible solution to lane changing/merging maneuvers. The decentralized planner, with a leaderless approach and a path-guidance principle derived from the Lyapunov-based control scheme, generates collision-free avoidance and safe merging maneuvers from multiple lanes to a single lane by deploying a split/merge strategy. The fixed obstacles are the markings and boundaries of the road lanes, while the moving obstacles are the robots themselves. Real and virtual road lane markings and the boundaries of road lanes are incorporated into a workspace to achieve the desired formation and configuration of the robots. Convergence of the robots to goal configurations and the repulsion of the robots from specified obstacles are achieved by suitable attractive and repulsive potential field functions, respectively. The results can be viewed as a significant contribution to the avoidance algorithms of intelligent vehicle systems (IVS). Computer simulations highlight the effectiveness of the split/merge strategy and the acceleration-based controllers.
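
The hedged sketch below illustrates the attractive/repulsive potential-field idea described above, not the paper's full Lyapunov-based controller: a point robot descends the gradient of an attractive well centred at the target lane position while being pushed away from a nearby obstacle (another robot or a lane boundary). The gains, the obstacle position and the goal are invented.

```python
import numpy as np

# Hedged potential-field sketch (point robot, gradient descent); gains are invented.
def potential_step(q, q_goal, obstacle, k_att=1.0, eta=0.4, d0=1.5, dt=0.1):
    grad_att = k_att * (q - q_goal)                      # gradient of 0.5*k*|q - q_goal|^2
    d = np.linalg.norm(q - obstacle)
    grad_rep = np.zeros(2)
    if d < d0:                                           # repulsion active only near the obstacle
        grad_rep = -eta * (1.0 / d - 1.0 / d0) * (q - obstacle) / d ** 3
    return q - dt * (grad_att + grad_rep)                # descend the combined potential

q, goal, obstacle = np.array([0.0, 0.0]), np.array([10.0, 3.5]), np.array([4.0, 1.0])
for _ in range(200):
    q = potential_step(q, goal, obstacle)
print("Final position:", q.round(2),
      "distance to goal:", np.linalg.norm(q - goal).round(3))
```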

Keywords: Lane merging, Lyapunov-based control scheme, path-guidance principle, split/merge strategy.

3517 Finite Element Modeling of Heat and Moisture Transfer in Porous Material

Authors: V. D. Thi, M. Li, M. Khelifa, M. El Ganaoui, Y. Rogaume

Abstract:

This paper presents a two-dimensional model to study heat and moisture transfer through porous building materials. Dynamic and static coupled models of heat and moisture transfer in porous material under low temperature are presented, and the coupled models, together with variable initial and boundary conditions, have been treated both analytically and with the finite element method. The resulting coupled model is converted into two nonlinear partial differential equations, which are then numerically solved by an implicit iterative scheme. The numerical results for temperature and moisture potential changes are compared with the experimental measurements available in the literature. The predicted results demonstrate the validity of the theoretical model and the effectiveness of the developed numerical algorithms. The model is expected to provide useful information for porous building material design based on heat and moisture transfer modeling.
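
The paper couples heat and moisture in 2D with the finite element method; as a hedged, much-reduced illustration of the implicit time stepping it mentions, the sketch below solves only the 1D heat equation with a backward-Euler scheme, solving one small linear system per step. The material and boundary data are invented, wood-like placeholders.

```python
import numpy as np

# Hedged 1D backward-Euler heat-conduction sketch; not the paper's coupled 2D FEM model.
alpha, L, nx, dt, steps = 1.2e-7, 0.05, 21, 60.0, 120   # diffusivity (m^2/s), 5 cm slab, 2 h total
dx = L / (nx - 1)
r = alpha * dt / dx**2

# Assemble the constant backward-Euler matrix (Dirichlet temperatures held at both ends)
A = np.eye(nx) * (1 + 2 * r)
for i in range(1, nx - 1):
    A[i, i - 1] = A[i, i + 1] = -r
A[0, :], A[-1, :] = 0, 0
A[0, 0] = A[-1, -1] = 1

T = np.full(nx, 20.0)          # initial temperature field (degC)
T[0], T[-1] = 5.0, 20.0        # cold exterior surface, warm interior surface
for _ in range(steps):
    T = np.linalg.solve(A, T)  # implicit update: A @ T_new = T_old

print("Temperature profile after 2 h:", T.round(2))
```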

Keywords: Finite element method, heat transfer, moisture transfer, porous materials, wood.

3516 Assessing Habitat-Suitability Models with a Virtual Species at Khao Nan National Park, Thailand

Authors: W. Srisang, K. Jaroensutasinee, M. Jaroensutasinee

Abstract:

This study examined a habitat-suitability assessment method, namely Ecological Niche Factor Analysis (ENFA). A virtual species was created and then dispatched in a geographic information system model of a real landscape under three historic scenarios: (1) spreading, (2) equilibrium, and (3) overabundance. In each scenario, the virtual species was sampled, and these simulated data sets were used as inputs for the ENFA to reconstruct the habitat suitability model. The 'equilibrium' scenario gives the highest quantity and quality among the three scenarios. ENFA was sensitive to the distribution scenarios but not to sample size. The use of a virtual species proved to be a very efficient method, allowing one to fully control the quality of the input data as well as to accurately evaluate the predictive power of the analyses.
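
The hedged sketch below illustrates two ingredients of this study design on synthetic data: defining a virtual species as a logistic response to an environmental gradient and sampling its occurrences, then computing ENFA's global marginality, commonly defined as |mean(species) − mean(global)| / (1.96 · sd(global)). The landscape, response curve and sample size are all invented, and real ENFA also extracts specialization factors not shown here.

```python
import numpy as np

# Hedged sketch: virtual-species sampling and ENFA-style marginality on synthetic data.
rng = np.random.default_rng(0)
elevation = rng.normal(500, 150, 5000)                # one environmental variable (m)

# Virtual species: presence probability rises with elevation (invented logistic response)
p_presence = 1 / (1 + np.exp(-(elevation - 600) / 50))
presence = rng.random(5000) < p_presence              # "equilibrium"-like occupancy
sample = rng.choice(np.where(presence)[0], size=200, replace=False)   # simulated field sampling

marginality = abs(elevation[sample].mean() - elevation.mean()) / (1.96 * elevation.std())
print(f"Species sample mean: {elevation[sample].mean():.0f} m, "
      f"global mean: {elevation.mean():.0f} m, marginality: {marginality:.2f}")
```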

Keywords: Habitat-suitability models, Ecological niche factor analysis, Climatic factors, Geographic information system.
