Search results for: placement models.
1958 Research on Residential Block Fabric: A Case Study of Hangzhou West Area
Abstract:
Residential block construction in China's big cities began in the 1950s, and four models have had a far-reaching influence on the development of the modern residential block: the unit compound and the residential district (1950s to 1980s), and the gated community and the open community (1990s to the present). Based on an analysis of the fabric of these four models, the article takes residential blocks in the Hangzhou west area as an example and carries out studies at the urban structure level and the block spatial level, covering the urban road network, land use, community function, road organization, public space, and building fabric. Finally, the article puts forward a "Semi-open Sub-community" strategy to improve the current fabric.
Keywords: Hangzhou West Area, residential block model, residential block fabric, "Semi-open Sub-community" strategy.
1957 Generative Adversarial Network Based Fingerprint Anti-Spoofing Limitations
Authors: Yehjune Heo
Abstract:
Fingerprint Anti-Spoofing approaches have been actively developed and applied in real-world applications. One of the main problems of Fingerprint Anti-Spoofing is a lack of robustness to unseen samples, especially in real-world scenarios. A possible solution is to generate artificial but realistic fingerprint samples and use them for training in order to achieve good generalization. This paper contains experimental and comparative results with currently popular GAN-based methods and uses realistic synthesis of fingerprints in training in order to increase performance. Among the various GAN models, StyleGAN, the most popular, is used for the experiments. The CNN models were first trained with a dataset that did not contain generated fake images, and the accuracy along with the mean average error rate were recorded. Then, the generated fake images (fake images of live fingerprints and fake images of spoof fingerprints) were each combined with the original images (real images of live fingerprints and real images of spoof fingerprints), and various CNN models were trained. The best performance of each CNN model trained with the dataset of generated fake images was recorded, along with the accuracy and the mean average error rate. We observe that current GAN-based approaches need significant improvements in Anti-Spoofing performance, although the overall quality of the synthesized fingerprints seems to be reasonable. We include an analysis of this performance degradation, especially with a small number of samples. In addition, we suggest several approaches towards improved generalization with a small number of samples, by focusing on what GAN-based approaches should and should not learn.
Keywords: Anti-spoofing, CNN, fingerprint recognition, GAN.
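As a minimal sketch of the evaluation protocol the abstract describes (train once on real data only, then again with GAN-generated fakes mixed in, and compare accuracy), the following uses a simple scikit-learn classifier as a stand-in for the CNNs and random arrays as stand-ins for the fingerprint images; all names, shapes, and data here are illustrative assumptions, not the paper's code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-ins for flattened fingerprint images (illustrative shapes only).
X_real, y_real = rng.normal(size=(1000, 256)), rng.integers(0, 2, 1000)  # live=0 / spoof=1
X_fake, y_fake = rng.normal(size=(1000, 256)), rng.integers(0, 2, 1000)  # StyleGAN outputs would go here
X_test, y_test = rng.normal(size=(400, 256)), rng.integers(0, 2, 400)    # unseen samples

# Baseline: train on real data only.
baseline = LogisticRegression(max_iter=1000).fit(X_real, y_real)
acc_base = accuracy_score(y_test, baseline.predict(X_test))

# Augmented: mix generated fakes of both classes into the training set.
X_aug = np.vstack([X_real, X_fake])
y_aug = np.concatenate([y_real, y_fake])
augmented = LogisticRegression(max_iter=1000).fit(X_aug, y_aug)
acc_aug = accuracy_score(y_test, augmented.predict(X_test))

print(f"baseline accuracy: {acc_base:.3f}, GAN-augmented accuracy: {acc_aug:.3f}")
```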
1956 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores
Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay
Abstract:
Automated product recognition in retail stores is an important real-world application in the domain of Computer Vision and Pattern Recognition. In this paper, we consider the problem of automatically identifying the classes of the products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon the existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following the strategy of online hard negative mining for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
Keywords: Retail stores, Faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition.
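A minimal sketch of a triplet loss with online hard mining, in the spirit of the training strategy named above: for each anchor in the batch, take the farthest positive and the nearest negative. This batch-hard variant is one common reading of "online-hard-negative-mining"; the embedding size, margin, and random inputs are assumptions, and a real pipeline would feed ResNet-18 embeddings here.

```python
import torch

def batch_hard_triplet_loss(embeddings, labels, margin=0.2):
    """Batch-hard mining: hardest positive and hardest negative per anchor."""
    d = torch.cdist(embeddings, embeddings)            # pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)  # positive-pair mask
    eye = torch.eye(len(labels), dtype=torch.bool)
    pos = d.masked_fill(~same | eye, float('-inf')).max(dim=1).values  # farthest positive
    neg = d.masked_fill(same, float('inf')).min(dim=1).values          # nearest negative
    return torch.clamp(pos - neg + margin, min=0).mean()

# Toy usage with random stand-in embeddings (illustrative only).
emb = torch.randn(32, 128, requires_grad=True)
lab = torch.randint(0, 8, (32,))
loss = batch_hard_triplet_loss(emb, lab)
loss.backward()
print(float(loss))
```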
1955 Refitting Equations for Peak Ground Acceleration in Light of the PF-L Database
Authors: M. Breška, I. Peruš, V. Stankovski
Abstract:
The number of Ground Motion Prediction Equations (GMPEs) used for predicting peak ground acceleration (PGA), and the number of earthquake recordings used for fitting these equations, have increased in the past decades. The current PF-L database contains 3550 recordings. Since GMPEs frequently model the peak ground acceleration, the goal of the present study was to refit a selection of 44 of the existing equation models for PGA in light of the latest data. The Levenberg-Marquardt algorithm was used for fitting the coefficients of the equations, and the results are evaluated both quantitatively, by presenting the root mean squared error (RMSE), and qualitatively, by drawing graphs of the five best-fitted equations. The RMSE was found to be as low as 0.08 for the best equation models. The newly estimated coefficients vary from the values published in the original works.
Keywords: Ground Motion Prediction Equations, Levenberg-Marquardt algorithm, refitting, PF-L database.
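As a sketch of the refitting step, the snippet below fits one illustrative GMPE functional form with SciPy's Levenberg-Marquardt implementation and reports the RMSE. The functional form, coefficients, and synthetic recordings are assumptions for demonstration; the paper refits 44 different published models against the PF-L data.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic stand-in recordings: magnitude M, distance R (km), observed ln(PGA).
rng = np.random.default_rng(1)
M = rng.uniform(4.0, 7.5, 500)
R = rng.uniform(5.0, 200.0, 500)
ln_pga = 0.5 * M - 1.2 * np.log(R + 10.0) - 1.0 + rng.normal(0, 0.08, 500)

def residuals(c, M, R, ln_pga):
    # One common GMPE shape: ln(PGA) = c0 + c1*M + c2*ln(R + c3)  (assumed form)
    return c[0] + c[1] * M + c[2] * np.log(R + c[3]) - ln_pga

fit = least_squares(residuals, x0=[0.0, 1.0, -1.0, 5.0],
                    args=(M, R, ln_pga), method='lm')  # Levenberg-Marquardt
rmse = np.sqrt(np.mean(fit.fun ** 2))
print("coefficients:", np.round(fit.x, 3), "RMSE:", round(rmse, 3))
```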
1954 Complex-Valued Neural Networks for Blind Equalization of Time-Varying Channels
Authors: Rajoo Pandey
Abstract:
Most of the commonly used blind equalization algorithms are based on the minimization of a nonconvex and nonlinear cost function, and a neural network gives a smaller residual error as compared to a linear structure. The efficacy of complex-valued feedforward neural networks for blind equalization of linear and nonlinear communication channels has been confirmed by many studies. In this paper, we present two neural network models for blind equalization of time-varying channels, for M-ary QAM and PSK signals. Complex-valued activation functions, suitable for these signal constellations in a time-varying environment, are introduced, and the learning algorithms based on the CMA cost function are derived. The improved performance of the proposed models is confirmed through computer simulations.
Keywords: Blind Equalization, Neural Networks, Constant Modulus Algorithm, Time-varying channels.
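For reference, here is a sketch of the classical constant modulus algorithm (CMA) update for a complex-valued linear equalizer, minimizing J = E[(|y|^2 - R2)^2]; the paper replaces this linear filter with complex-valued neural networks, and the channel, step size, and tap count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N, taps, mu = 5000, 11, 1e-3

# QPSK symbols through a simple channel plus noise (illustrative setup).
s = (rng.integers(0, 2, N) * 2 - 1 + 1j * (rng.integers(0, 2, N) * 2 - 1)) / np.sqrt(2)
h = np.array([1.0, 0.4 + 0.2j, 0.1])
x = np.convolve(s, h)[:N] + 0.01 * (rng.normal(size=N) + 1j * rng.normal(size=N))

R2 = np.mean(np.abs(s) ** 4) / np.mean(np.abs(s) ** 2)  # constant-modulus target
w = np.zeros(taps, dtype=complex)
w[taps // 2] = 1.0  # center-spike initialization

for n in range(taps, N):
    xn = x[n - taps:n][::-1]          # regressor, most recent sample first
    y = np.dot(w.conj(), xn)          # equalizer output y = w^H x
    e = y * (np.abs(y) ** 2 - R2)     # CMA error term
    w -= mu * e.conj() * xn           # stochastic-gradient tap update

print("leading tap magnitudes:", np.round(np.abs(w[:3]), 3))
```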
1953 Fuzzy Control of Macroeconomic Models
Authors: Andre A. Keller
Abstract:
Optimal control is one of the possible approaches to controlling a dynamic system, using a linear quadratic regulator together with Pontryagin's principle or the dynamic programming method. Stochastic disturbances may affect the coefficients (multiplicative disturbances) or the equations (additive disturbances), provided that the shocks are not too great. Nevertheless, this approach encounters difficulties when the uncertainties are very important or when the probability calculus is of no help with very imprecise data. Fuzzy logic contributes a pragmatic solution to such a problem since it operates on fuzzy numbers. A fuzzy controller acts as an artificial decision maker that operates in a closed-loop system in real time. This contribution explores the tracking problem and the control of dynamic macroeconomic models using a fuzzy learning algorithm. A two-input, single-output (TISO) fuzzy model is applied to the linear fluctuation model of Phillips and to the nonlinear growth model of Goodwin.
Keywords: fuzzy control, macroeconomic model, multiplier-accelerator, nonlinear accelerator, stabilization policy.
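A minimal sketch of a two-input, single-output Mamdani fuzzy controller, the TISO structure named above: triangular membership functions, min-based rule firing, and centroid defuzzification. The economic variables, rule base, and universes are invented for illustration and are not the paper's design.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function over universe x."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

u = np.linspace(-1.0, 1.0, 201)  # output universe: policy adjustment (illustrative)

def tiso_control(gap, gap_change):
    """Two inputs (output gap and its change), one output, Mamdani inference."""
    # Membership degrees of the inputs (negative / positive), clipped to [0, 1].
    gap_neg, gap_pos = max(min(-gap, 1), 0), max(min(gap, 1), 0)
    chg_neg, chg_pos = max(min(-gap_change, 1), 0), max(min(gap_change, 1), 0)
    # Rules: firing strength = min of antecedents; clip the consequent set.
    agg = np.zeros_like(u)
    agg = np.maximum(agg, np.minimum(min(gap_neg, chg_neg), tri(u, 0.0, 0.5, 1.0)))    # stimulate
    agg = np.maximum(agg, np.minimum(min(gap_pos, chg_pos), tri(u, -1.0, -0.5, 0.0)))  # restrain
    agg = np.maximum(agg, np.minimum(min(gap_neg, chg_pos), tri(u, -0.5, 0.0, 0.5)))   # hold
    agg = np.maximum(agg, np.minimum(min(gap_pos, chg_neg), tri(u, -0.5, 0.0, 0.5)))   # hold
    # Centroid defuzzification.
    return float(np.sum(u * agg) / np.sum(agg)) if agg.any() else 0.0

print(tiso_control(gap=-0.4, gap_change=-0.2))  # positive corrective action
```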
1952 Graphic Animation: Innovative Language Learning for Autistic Children
Authors: Norfishah Mat Rabi, Rosma Osman, Norziana Mat Rabi
Abstract:
It is difficult for autistic children to mix with and be around other people. Language difficulties are a problem that affects their social life. A lack of knowledge and ability in language are factors that greatly influence their behavior and their ability to communicate and interact. Autistic children need to be assisted to improve their language abilities through the use of suitable learning resources. This study was conducted to identify whether graphic animation resources can help autistic children learn and use transitive verbs more effectively. The study was conducted in a rural secondary school in Penang, Malaysia. The research subjects comprised three autistic students ranging in age from 14 years to 16 years. The 14-year-old student is placed in A Class and the two 16-year-old students in B Class. The class placement of the subjects is based on the results of a diagnostic test conducted by the teacher, not on age. Data collection was done through observation and interviews over a duration of five weeks, with the researcher allocating 30 minutes for every learning activity carried out. The research findings show that the subjects learn transitive verbs better using graphic animation than using static pictures. It is hoped that this study will give a new perspective on the learning processes of autistic children.
Keywords: Autistic, graphic animation, language learning, transitive verbs.
1951 A Scatter Search and Help Policies Approaches for a New Mixed Model Assembly Lines Sequencing Problem
Authors: N. Manavizadeh , M. Rabbani , H. Sotudian , F. Jolai
Abstract:
Mixed model production is the practice of assembling several distinct and different models of a product on the same assembly line without changeovers and then sequencing those models in a way that smooths the demand for upstream components. In this paper, we consider an objective function that minimizes total stoppage time and total idle time, with sequence-dependent setup times. Many studies have been done on mixed model assembly lines, but in this paper we focus specifically on reducing the idle times, which is possible through various help policies. To improve the solutions, several cases were developed and about 40 test problems were considered. We use scatter search for optimization, and the experimental results show the behavior and efficiency of our algorithm. Scatter search and help policies can produce high-quality answers, so they have been used in this paper.
Keywords: mixed model assembly lines, scatter search, help policies, idle time, stoppage time.
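A compact skeleton of scatter search, the metaheuristic used above, showing its four classical components (diversification, reference set, subset combination, local improvement) on a toy continuous objective. The objective, set sizes, and improvement step are placeholders, not the authors' sequencing-specific implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
f = lambda x: np.sum((x - 0.3) ** 2)   # toy objective standing in for stoppage + idle time

def improve(x):
    """Simple local improvement by random perturbation."""
    for _ in range(20):
        cand = np.clip(x + rng.normal(0, 0.05, x.size), 0, 1)
        if f(cand) < f(x):
            x = cand
    return x

# 1) Diversification: scattered starting solutions, each locally improved.
pool = [improve(rng.uniform(0, 1, 5)) for _ in range(20)]
# 2) Reference set: keep the best b solutions.
ref = sorted(pool, key=f)[:5]
for _ in range(30):
    # 3) Subset generation + combination: combine pairs from the reference set.
    trials = [improve((a + b) / 2) for i, a in enumerate(ref) for b in ref[i + 1:]]
    # 4) Reference set update.
    ref = sorted(ref + trials, key=f)[:5]

print("best value:", round(f(ref[0]), 6))
```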
1950 Equilibrium and Rate Based Simulation of MTBE Reactive Distillation Column
Authors: Debashish Panda, Kannan A.
Abstract:
Equilibrium and rate-based models have been applied in the simulation of methyl tertiary-butyl ether (MTBE) synthesis through reactive distillation. Temperature and composition profiles were compared for both models, and it was found that the profiles, though qualitatively similar in trend, differ significantly in quantitative terms. In the rate-based method (RBM), multicomponent mass transfer coefficients have been incorporated to describe interphase mass transfer. The MTBE mole fraction in the bottom stream is found to be 0.9914 in the equilibrium model (EQM) and only 0.9904 in the RBM when the same column configuration is preserved. The individual tray efficiencies were incorporated in the EQM and simulations were carried out. Dynamic simulations have also been carried out for the two column configurations and compared.
Keywords: Aspen Plus, equilibrium stage model, methyl tertiary-butyl ether, rate based model.
1949 A Control Model for the Dismantling of Industrial Plants
Authors: Florian Mach, Eric Hund, Malte Stonis
Abstract:
The dismantling of disused industrial facilities such as nuclear power plants or refineries is an enormous challenge for the planning and control of the logistic processes. Existing control models do not meet the requirements for a proper dismantling of industrial plants. Therefore, the paper presents an approach for the control of dismantling and post-processing processes (e.g. decontamination) in plant decommissioning. In contrast to existing approaches, the dismantling sequence and depth are selected depending on the capacity utilization of required post-processing processes by also considering individual characteristics of respective dismantling tasks (e.g. decontamination success rate, uncertainties regarding the process times). The results can be used in the dismantling of industrial plants (e.g. nuclear power plants) to reduce dismantling time and costs by avoiding bottlenecks such as capacity constraints.
Keywords: Dismantling management, logistics planning and control models, nuclear power plant dismantling, reverse logistics.
1948 Therapeutic Product Preparation Bioprocess Modeling
Authors: Mihai Caramihai, Irina Severin, Ana Aurelia Chirvase, Adrian Onu, Cristina Tanase, Camelia Ungureanu
Abstract:
An immunomodulator bioproduct is prepared in a batch bioprocess with a modified bacterium, Pseudomonas aeruginosa. The bioprocess is performed in a 100 L Bioengineering bioreactor with 42 L of cultivation medium made of peptone, meat extract, and sodium chloride. The optimal bioprocess parameters were determined: temperature 37 °C, agitation speed 300 rpm, aeration rate 40 L/min, pressure 0.5 bar, Dow Corning Antifoam M at a maximum of 4% of the medium volume, and a duration of 6 hours. Bioprocesses of this kind are considered difficult to control because their dynamic behavior is highly nonlinear and time-varying. The aim of the paper is to present and compare different models based on experimental data. The analysis criteria were modeling error and convergence rate. The estimated values and the modeling analysis were obtained using Table Curve 2D. The preliminary conclusions indicate Andrews's model, with a maximum specific growth rate of the bacterium in the range of 0.8 h⁻¹.
Keywords: bioprocess modeling, Pseudomonas aeruginosa, kinetic models.
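The Andrews model referred to at the end of the abstract is the classical substrate-inhibition (Haldane-type) growth kinetics; a short sketch follows, where the maximum specific growth rate comes from the abstract (about 0.8 h⁻¹) but the half-saturation and inhibition constants are assumed values.

```python
import numpy as np

def andrews_mu(S, mu_max=0.8, Ks=0.5, Ki=50.0):
    """Andrews (Haldane-type) specific growth rate in h^-1.
    mu_max ~0.8 h^-1 per the abstract; Ks and Ki are assumptions."""
    return mu_max * S / (Ks + S + S ** 2 / Ki)

S = np.linspace(0.0, 100.0, 5)   # substrate concentration (g/L, illustrative)
print([round(andrews_mu(s), 3) for s in S])  # growth rises, peaks, then is inhibited
```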
1947 Boosting Method for Automated Feature Space Discovery in Supervised Quantum Machine Learning Models
Authors: Vladimir Rastunkov, Jae-Eun Park, Abhijit Mitra, Brian Quanz, Steve Wood, Christopher Codella, Heather Higgins, Joseph Broz
Abstract:
Quantum Support Vector Machines (QSVM) have become an important tool in research on and applications of quantum kernel methods. In this work, we propose a boosting approach for building ensembles of QSVM models and assess the performance improvement across multiple datasets. This approach is derived from the best ensemble-building practices that have worked well in traditional machine learning and thus should push the limits of quantum model performance even further. We find that in some cases a single QSVM model with tuned hyperparameters is sufficient to simulate the data, while in others an ensemble of QSVMs that are forced to explore the feature space via the proposed method is beneficial.
Keywords: QSVM, Quantum Support Vector Machines, quantum kernel, boosting, ensemble.
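A hedged sketch of an AdaBoost-style ensemble of kernel SVMs, illustrating the general boosting idea the abstract builds on: a classical RBF kernel stands in for the quantum kernel, and the textbook AdaBoost weighting below is an assumption, not the authors' exact scheme.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=300, n_features=4, random_state=0)
y_pm = 2 * y - 1                      # AdaBoost works with labels in {-1, +1}
w = np.full(len(y), 1 / len(y))      # sample weights
models, alphas = [], []

for _ in range(5):
    clf = SVC(kernel='rbf', C=1.0)    # an RBF kernel stands in for a quantum kernel
    clf.fit(X, y, sample_weight=w)
    pred = clf.predict(X)
    err = np.sum(w * (pred != y)) / np.sum(w)
    if err >= 0.5 or err == 0:        # stop on a useless or perfect weak learner
        models.append(clf); alphas.append(1.0); break
    alpha = 0.5 * np.log((1 - err) / err)
    w *= np.exp(-alpha * y_pm * (2 * pred - 1))   # upweight misclassified samples
    w /= w.sum()
    models.append(clf); alphas.append(alpha)

# Weighted-vote ensemble prediction.
votes = sum(a * (2 * m.predict(X) - 1) for a, m in zip(alphas, models))
print("ensemble train accuracy:", np.mean((votes > 0).astype(int) == y))
```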
1946 An Examination of the Factors Influencing Software Development Effort
Authors: Zhizhong Jiang, Peter Naudé
Abstract:
Effective evaluation of software development effort is an important aspect of successful project management. Based on a large database of 4,106 completed projects, this study statistically examines the factors that influence development effort. The factors found to be significant for effort are project size, the average number of developers that worked on the project, the type of development, the development language, the development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, this study found that the use of CASE tools does not necessarily reduce development effort, which adds support to the claim that the benefit of such tools is subtle. As many of the current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for the prediction of effort which is both simple and more accurate than previous models.
Keywords: Development effort, function points, team size, development language, CASE tool, rapid application development.
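The abstract does not give the form of its parsimonious parametric model; one common shape for such effort models is a log-log regression, Effort = a·Size^b adjusted by other drivers, sketched here on synthetic data purely as an assumption.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
size = rng.uniform(50, 2000, 400)            # project size in function points
team = rng.integers(1, 15, 400)              # average number of developers
effort = 3.0 * size ** 1.1 * team ** 0.3 * rng.lognormal(0, 0.2, 400)  # synthetic

# log Effort = log a + b*log Size + c*log Team  (illustrative functional form)
X = np.column_stack([np.log(size), np.log(team)])
model = LinearRegression().fit(X, np.log(effort))
print("estimated exponents (size, team):", np.round(model.coef_, 2))
```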
1945 On the Mathematical Structure and Algorithmic Implementation of Biochemical Network Models
Authors: Paola Lecca
Abstract:
Modeling and simulation of biochemical reactions is of great interest in the context of systems biology. The central dogma of this re-emerging area states that it is the system dynamics and organizing principles of complex biological phenomena that give rise to the functioning and function of cells. Cell functions, such as growth, division, differentiation, and apoptosis, are temporal processes that can be understood if they are treated as dynamic systems. Systems biology focuses on an understanding of functional activity from a system-wide perspective and, consequently, it is defined by two key questions: (i) how do the components within a cell interact, so as to bring about its structure and functioning? (ii) how do cells interact, so as to develop and maintain higher levels of organization and functions? In recent years, wet-lab biologists have embraced mathematical modeling and simulation as two essential means toward answering the above questions. The credo of dynamical systems theory is that the behavior of a biological system is given by the temporal evolution of its state. Our understanding of the time behavior of a biological system can be measured by the extent to which a simulation mimics the real behavior of that system. Deviations of a simulation indicate either limitations or errors in our knowledge. The aim of this paper is to summarize and review the main conceptual frameworks in which models of biochemical networks can be developed. In particular, we review the stochastic molecular modelling approaches, reporting the principal conceptualizations suggested by A. A. Markov, P. Langevin, A. Fokker, M. Planck, D. T. Gillespie, N. G. van Kampen, and recently by D. Wilkinson, O. Wolkenhauer, P. S. Jöberg, and by the author.
Keywords: Mathematical structure, algorithmic implementation, biochemical network models.
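Since the abstract cites Gillespie, a minimal sketch of his direct stochastic simulation algorithm for a toy reversible dimerization network may help fix ideas; the rate constants and molecule counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
# Reactions: 2A -> B (rate k1), B -> 2A (rate k2); one stoichiometry row per reaction.
k1, k2 = 0.001, 0.1
stoich = np.array([[-2, +1], [+2, -1]])
state = np.array([1000, 0])   # molecule counts of A and B
t, t_end = 0.0, 10.0

while t < t_end:
    A, B = state
    props = np.array([k1 * A * (A - 1) / 2, k2 * B])  # reaction propensities
    a0 = props.sum()
    if a0 == 0:
        break
    t += rng.exponential(1 / a0)            # time to the next reaction event
    r = rng.choice(2, p=props / a0)         # which reaction fires
    state = state + stoich[r]

print("t =", round(t, 2), " A, B =", state)
```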
1944 Appraisal of Methods for Identifying, Mapping, and Modelling of Fluvial Erosion in a Mining Environment
Authors: F. F. Howard, I. Yakubu, C. B. Boye, J. S. Y. Kuma
Abstract:
Natural and human activities, such as mining operations, expose the natural soil to adverse environmental conditions, leading to contamination of soil, groundwater, and surface water, which has negative effects on humans, flora, and fauna. Bare or partly exposed soil is most liable to fluvial erosion. This paper enumerates various methods used to identify, map, and model fluvial erosion in a mining environment. Classical, Artificial Intelligence (AI), and GIS methods have been reviewed. One of the many classical methods used to estimate river erosion is the Revised Universal Soil Loss Equation (RUSLE) model. The RUSLE model is easy to use, but its reliance on empirical relationships that may not always apply to specific circumstances or locations is a flaw. Other classical models for estimating fluvial erosion are the Soil and Water Assessment Tool (SWAT) and the Universal Soil Loss Equation (USLE). These models offer a more complete understanding of the underlying physical processes and encompass a wider range of situations. Although more difficult to utilise, they depend on the availability and reliability of input data for their accuracy. AI can help deal with multivariate and complex difficulties, can predict soil loss with higher accuracy than traditional methods, and can also be used to build unique models for identifying degraded areas. AI techniques have become popular as an alternative predictor for degraded environments. Consequently, this research proposes a hybrid of classical, AI, and GIS methods for efficient and effective modelling of fluvial erosion.
Keywords: Fluvial erosion, classical methods, Artificial Intelligence, Geographic Information System.
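The RUSLE model named above is a simple product of six factors, A = R·K·LS·C·P (with LS combining slope length and steepness); a one-function sketch follows, where the factor values are placeholders rather than site measurements.

```python
def rusle_soil_loss(R, K, LS, C, P):
    """RUSLE: A = R * K * LS * C * P, average annual soil loss (t/ha/yr).
    R: rainfall erosivity, K: soil erodibility, LS: slope length/steepness,
    C: cover management, P: support practice. Values below are placeholders."""
    return R * K * LS * C * P

# Illustrative factors for a partly exposed mine-site slope (assumed values).
print(rusle_soil_loss(R=500, K=0.03, LS=2.5, C=0.45, P=1.0), "t/ha/yr")
```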
1943 Convection through Light Weight Timber Constructions with Mineral Wool
Authors: J. Schmidt, O. Kornadt
Abstract:
The major part of a light weight timber construction consists of insulation. Mineral wool is the most commonly used insulation due to its cost efficiency and easy handling. The fiber orientation and porosity of this insulation material enable flow-through, and its air flow resistance is low. If leakage occurs in the insulated bay section, the convective flow may cause energy losses and infiltration of the exterior wall with moisture and particles. In particular, the infiltrated moisture may lead to thermal bridges and to the growth of health-endangering mould and mildew. In order to prevent this problem, different numerical calculation models have been developed. All models developed so far have potential for improvement, and the implementation of the flow-through properties of mineral wool insulation may help to improve the existing models. Assuming that the real pressure difference between the interior and exterior surfaces is larger than the pressure difference prescribed in the standard test procedure for mineral wool, ISO 9053 / EN 29053, measurements were performed using the measurement setup for research on convective moisture transfer (MSRCMT). These measurements show that structural inhomogeneities of mineral wool affect the permeability only at the higher pressure differences applied in the MSRCMT. Additional microscopic investigations show that the location of a leak within the construction has a crucial influence on the air flow-through and the infiltration rate. The results clearly indicate that the empirical values for the acoustic resistance of mineral wool should not be used for the calculation of convective transfer mechanisms.
Keywords: convection, convective transfer, infiltration, mineral wool, permeability, resistance, leakage.
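For orientation, a small sketch of the airflow quantities behind the ISO 9053 / EN 29053 characterization referred to above: airflow resistance R = Δp/q, specific resistance Rs = R·A, and resistivity r = Rs/d. The sample dimensions and measured values below are assumptions for illustration.

```python
def airflow_resistivity(dp_pa, q_m3s, area_m2, thickness_m):
    """Airflow resistance R = dp/q (Pa*s/m^3), specific resistance Rs = R*A
    (Pa*s/m), resistivity r = Rs/d (Pa*s/m^2); quantities as defined in
    ISO 9053 / EN 29053. Inputs below are assumed sample values."""
    R = dp_pa / q_m3s
    Rs = R * area_m2
    return Rs / thickness_m

# A 100 mm mineral wool sample under a small pressure difference (illustrative).
print(airflow_resistivity(dp_pa=2.0, q_m3s=1e-4, area_m2=0.01, thickness_m=0.1),
      "Pa*s/m^2")
```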
1942 Automated Transformation of 3D Point Cloud to Building Information Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Petar Penchev
Abstract:
The digital era has revolutionized architectural practices, with Building Information Modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research presents a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data — a collection of data points in space, typically produced by 3D scanners — into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historical preservation.
Keywords: Algorithmic modeling, Building Information Modeling, point cloud, reconstruction.
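One early stage of such a pipeline, detecting planar elements (candidate walls and floors) in a point cloud with RANSAC, can be sketched with Open3D as below; the file name, thresholds, and segment count are assumptions, and the full translation to parametric BIM elements lies beyond this snippet.

```python
import open3d as o3d

# Load a scan (the path is a placeholder).
pcd = o3d.io.read_point_cloud("scan.ply")

# Iteratively extract the largest planar segments as wall/floor candidates.
planes = []
rest = pcd
for _ in range(5):
    model, inliers = rest.segment_plane(distance_threshold=0.02,
                                        ransac_n=3, num_iterations=1000)
    planes.append((model, rest.select_by_index(inliers)))  # plane ax+by+cz+d=0 and its points
    rest = rest.select_by_index(inliers, invert=True)      # continue on the remainder

for model, seg in planes:
    print("plane:", [round(float(v), 3) for v in model], "points:", len(seg.points))
```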
1941 A Study on the Secure ebXML Transaction Models
Authors: Dongkyoo Shin, Dongil Shin, Sukil Cha, Seyoung Kim
Abstract:
ebXML (Electronic Business using eXtensible Markup Language) is an e-business standard, sponsored by UN/CEFACT and OASIS, that enables enterprises to exchange business messages, conduct trading relationships, communicate data in common terms, and define and register business processes. While there is tremendous e-business value in ebXML, security remains an unsolved problem and one of the largest barriers to adoption. XML security technologies that have emerged recently have the extensibility and flexibility suitable for security implementations such as encryption, digital signatures, access control, and authentication. In this paper, we propose ebXML business transaction models that allow trading partners to securely exchange XML-based business transactions by employing XML security technologies. We show how each XML security technology meets the ebXML standard by constructing test software and validating the messages between the trading partners.
Keywords: Electronic commerce, e-business standard, ebXML, XML security, secure business transaction.
1940 Instructional Design Practitioners in Malaysia: Skills and Issues
Authors: Irfan N. Umar, Yong Su-Lyn
Abstract:
The purpose of this research is to determine the knowledge and skills possessed by instructional design (ID) practitioners in Malaysia. As ID is a relatively new field in the country and there seems to be an absence of any studies on its community of practice, the main objective of this research is to discover the tasks and activities performed by ID practitioners in educational and corporate organizations, as suggested by the International Board of Standards for Training, Performance and Instruction. This includes finding out the ID models applied in the course of their work. This research also attempts to identify the barriers and issues explaining why some ID tasks and activities are rarely or never conducted. The methodology employed in this descriptive study was a survey questionnaire sent to 30 instructional designers nationwide. The results showed that the majority of the tasks and activities are carried out frequently enough, but omissions do occur for reasons such as the tasks being out of the job scope, the decision having already been made at a higher level, and a lack of knowledge and skills. Further investigations of a qualitative nature should be conducted to achieve a more in-depth understanding of ID practices in Malaysia.
Keywords: instructional design, ID competencies, ID models, IBSTPI.
1939 Data Envelopment Analysis under Uncertainty and Risk
Authors: P. Beraldi, M. E. Bruni
Abstract:
Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision making units. Traditionally, it assumes that input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that output parameters are represented by discretely distributed random variables, and we propose two different models defined according to a neutral and a risk-averse perspective. The models have been validated by considering a real case study concerning the evaluation of the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique by providing the decision maker with useful insights depending on their degree of risk aversion.
Keywords: DEA, Stochastic Programming, ex-ante evaluation technique, Conditional Value at Risk.
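The deterministic core that the stochastic models extend is the input-oriented CCR model, one linear program per decision making unit; a sketch with SciPy follows, where the toy data matrix is an assumption.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 2 inputs (rows of X) and 1 output (rows of Y) for 4 firms (columns).
X = np.array([[4.0, 7.0, 8.0, 4.0],
              [3.0, 3.0, 1.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
n = X.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR for unit o: min theta s.t. X@lam <= theta*x_o, Y@lam >= y_o."""
    c = np.concatenate([[1.0], np.zeros(n)])              # variables: theta, lambda_1..n
    A_in = np.hstack([-X[:, [o]], X])                     # X@lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])    # -Y@lam <= -y_o
    A = np.vstack([A_in, A_out])
    b = np.concatenate([np.zeros(X.shape[0]), -Y[:, o]])
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1), method='highs')
    return res.x[0]

print([round(ccr_efficiency(o), 3) for o in range(n)])  # efficiency score per firm
```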
1938 WPRiMA Tool: Managing Risks in Web Projects
Authors: Thamer Al-Rousan, Shahida Sulaiman, Rosalina Abdul Salam
Abstract:
Risk management is an essential part of project management and plays a significant role in project success. Many failures associated with Web projects are the consequences of poor awareness of the risks involved and the lack of process models that can serve as guidelines for the development of Web-based applications. To circumvent this problem, contemporary process models have been devised for the development of conventional software. This paper introduces WPRiMA (Web Project Risk Management Assessment), the tool used to implement RIAP, the risk identification architecture pattern model, which focuses on data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level for a given Web project, to generate recommendations in order to facilitate risk avoidance in a project, and to improve the prospects of early risk management.
Keywords: Architecture pattern model, risk factors, risk identification, web project, web project risk management assessment.
1937 Identification of the Key Sustainability Issues to Develop New Decision Support Tools in the Spanish Furniture Sector
Authors: P.Cordero, R.Poler, R.Sanchis
Abstract:
The environmental impacts caused by current production and consumption models, together with the effects of the current economic crisis, call for changes in European industry toward new business models based on sustainability issues that could allow companies to innovate and improve their competitiveness. This paper analyzes the key environmental issues and the current and future market trends in one of the most important industrial sectors in Spain, the furniture sector. It also proposes new decision support tools (a diagnostic kit, a roadmap, and guidelines) to guide companies in implementing sustainability criteria in their organizations, including eco-design strategies and other economic and social strategies in accordance with the definition of sustainability, as well as other available tools such as eco-labels and environmental management systems, and to use and combine them to obtain the results the company expects in order to help improve its competitiveness.
Keywords: Furniture sector, eco-design, sustainability, economic crisis, market trends, roadmap.
1936 Native Language Identification with Cross-Corpus Evaluation Using Social Media Data: 'Reddit'
Authors: Yasmeen Bassas, Sandra Kuebler, Allen Riddell
Abstract:
Native Language Identification is one of the growing subfields in Natural Language Processing (NLP). The task of Native Language Identification (NLI) is mainly concerned with predicting the native language of an author from their writing in a second language. In this paper, we investigate the performance of two types of features, content-based features vs. content-independent features, when they are evaluated on a different corpus (using social media data from Reddit). In this NLI task, the predefined models are trained on one corpus (TOEFL) and then evaluated on different data from an external corpus (Reddit). Three classifiers are used in this task: the baseline, a linear SVM, and Logistic Regression. Results show that content-based features are more accurate and robust than content-independent ones when tested both within and across corpora.
Keywords: NLI, NLP, content-based features, content-independent features, social media corpus, ML.
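A skeleton of the cross-corpus protocol, fitting on one corpus and testing on another, with TF-IDF word n-grams standing in for the paper's content-based features; the tiny inline texts are placeholders for TOEFL essays and Reddit posts, not real data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.pipeline import make_pipeline

# Placeholder documents; real experiments would load TOEFL essays and Reddit posts.
train_texts = ["ich think this is good essay", "la idea es muy importante aqui",
               "this essay argue about the topic", "es un texto con muchas ideas"]
train_langs = ["German", "Spanish", "German", "Spanish"]
test_texts = ["the posting talk about important idea", "una respuesta sobre el tema"]
test_langs = ["German", "Spanish"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(train_texts, train_langs)                           # train within one corpus
print(accuracy_score(test_langs, clf.predict(test_texts)))  # evaluate across corpus
```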
1935 Modeling Biology Inspired Reactive Agents Using X-machines
Authors: George Eleftherakis, Petros Kefalas, Anna Sotiriadou, Evangelos Kehris
Abstract:
Recent advances in both the testing and verification of software based on formal specifications of the system to be built have reached a point where the ideas can be applied in a powerful way in the design of agent-based systems. The software engineering research has highlighted a number of important issues: the importance of the type of modeling technique used; the careful design of the model to enable powerful testing techniques to be used; the automated verification of the behavioural properties of the system; the need to provide a mechanism for translating the formal models into executable software in a simple and transparent way. This paper introduces the use of the X-machine formalism as a tool for modeling biology inspired agents proposing the use of the techniques built around X-machine models for the construction of effective, and reliable agent-based software systems.
Keywords: Biology inspired agent, formal methods, x-machines.
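A stream X-machine couples a finite control with a memory and a set of processing functions labeling the transitions; a minimal rendering as a Python class follows, with a toy feeding agent as the example. The structure reflects the general formalism, and the class design and agent are assumptions, not the paper's models.

```python
class StreamXMachine:
    """Minimal stream X-machine: states Q, memory M, and processing
    functions phi: (memory, input) -> (output, memory) labeling transitions."""
    def __init__(self, state, memory, transitions):
        self.state, self.memory = state, memory
        self.transitions = transitions  # {(state, phi): next_state}

    def step(self, symbol):
        for (state, phi), nxt in self.transitions.items():
            if state == self.state:
                result = phi(self.memory, symbol)
                if result is not None:            # phi applicable to this input
                    out, self.memory = result
                    self.state = nxt
                    return out
        raise ValueError("no applicable transition")

# Toy agent: accumulate 'food' stimuli until full (illustrative phi functions).
eat = lambda m, s: ("ate", m + 1) if s == "food" and m < 2 else None
rest = lambda m, s: ("full", m) if m >= 2 else None
m = StreamXMachine("hungry", 0, {("hungry", eat): "hungry", ("hungry", rest): "sated"})
print(m.step("food"), m.step("food"), m.step("anything"))
```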
1934 Modelling and Analysis of a Robust Control of Manufacturing Systems: Flow-Quality Approach
Authors: Lotfi Nabli, Achraf Jabeur Telmoudi, Radhi M'hiri
Abstract:
This paper proposes a method for modeling the laws controlling manufacturing systems with temporal and non-temporal constraints. A methodology for constructing robust control, generating the margins of passive and active robustness, is elaborated. Two principal models are presented in this paper. The first utilizes P-time Petri Nets, which are used to manage flow-type disturbances. The second, the quality model, exploits the Intervals Constrained Petri Nets (ICPN) tool, which allows the system to preserve its quality specificities. The redundancy of the robustness of the elementary parameters between passive and active is also used. The final model allows the correlation of temporal and non-temporal criteria by putting the two principal models in interaction. To do so, a set of definitions and theorems are employed and illustrated by application examples.
Keywords: Manufacturing systems control, flow, quality, robustness, redundancy, Petri Nets.
1933 Effect of Testing Device Calibration on Liquid Limit Assessment
Authors: M. O. Bayram, H. B. Gencdal, N. O. Fercan, B. Basbug
Abstract:
Liquid limit, which is used as a measure of soil strength, can be determined by the Casagrande and fall-cone testing methods. The two methods diverge from each other chiefly in terms of operator dependency. The Casagrande method, applied according to the ASTM D4318-17 standard, may give misleading results, especially if the calibration process is not performed well. In this study, to reveal the effect of calibration of the drop height and of the amount of soil paste placed in the Casagrande cup, a series of tests was carried out by the multipoint method as specified in the ASTM standard. The tests include combinations of 6 mm, 8 mm, 10 mm, and 12 mm drop heights with under-filled, half-filled, and full-filled Casagrande cups, using kaolin samples. It was observed that during successive tests the drop height of the cup deteriorated; hence the device was recalibrated before and after each test to ensure the accuracy of the results. Moreover, for the under-filled and full-filled samples, tests at higher drop heights yielded lower liquid limit values than tests at lower drop heights. For the half-filled samples, the liquid limit values did not change at all as the drop height increased, which demonstrates the purpose of the standard specifications.
Keywords: Calibration, Casagrande cup method, drop height, kaolin, liquid limit, placing form.
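In the multipoint method referred to above, the liquid limit is read from the flow curve (water content against the logarithm of the blow count) at 25 blows; a short sketch follows, where the trial data are made-up values for illustration.

```python
import numpy as np

# Multipoint Casagrande trials: blow counts N and water contents w (%).
N = np.array([16, 21, 28, 35])          # illustrative trial data
w = np.array([44.8, 43.1, 41.6, 40.3])

# The flow curve is taken as linear in log10(N); liquid limit = w at N = 25 blows.
slope, intercept = np.polyfit(np.log10(N), w, 1)
LL = slope * np.log10(25) + intercept
print("liquid limit:", round(LL, 1), "%")
```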
1932 Analysis of Incidences of Collapsed Buildings in the City of Douala, Cameroon from 2011-2020
Authors: T. G. L. J. Bikoko, J. C. Tchamba, S. Amziane
Abstract:
This study focuses on the problem of collapsed buildings within the city of Douala over the past ten years, more precisely within the period from 2011 to 2020. It was carried out in a bid to ascertain the real causes of this phenomenon, which has become recurrent in the leading economic city of Cameroon. To achieve this, it was first necessary to review works dealing with construction materials and technology as well as case histories of structural collapse within the city. Thereafter, a statistical study was carried out on the results obtained. It was found that the causes of building collapses in the city of Douala are: neglect of administrative procedures, use of poor-quality materials, poor composition and mixing of concrete, lack of geotechnical studies, lack of structural analysis and design, corrosion of the reinforcement bars, poor maintenance of buildings, and other causes. Out of the 46 cases of failure and collapse of buildings within the city of Douala, 7 were identified as having had no geotechnical study carried out, a percentage of 15.22%. It was also observed that out of the 46 cases of structural failure, 6 were the result of a lack of proper structural analysis and design, a percentage of 13.04%. Subsequently, recommendations and suggestions are made that place particular emphasis on the choice of materials, the manufacture and casting of concrete, and the placement of the required reinforcement, all of which helps guarantee the stability of a building.
Keywords: Collapsed buildings, Douala, structural collapse, Cameroon.
1931 Data-Driven Decision-Making in Digital Entrepreneurship
Authors: Abeba Nigussie Turi, Xiangming Samuel Li
Abstract:
Data-driven business models are more typical for established businesses than for early-stage startups that strive to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. Here, we develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers in the form of poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in its form, encompassing startup data analytics enablers and metrics that align with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship.
Keywords: Startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship.
1930 Developing Kazakh Language Fluency Test in Nazarbayev University
Authors: Saule Mussabekova, Samal Abzhanova
Abstract:
The Kazakh Language Fluency Test, based on the IELTS exam, was implemented in 2012 at Nazarbayev University in Astana, Kazakhstan. We would like to share our experience in developing this exam, and some exam results, with other language instructors. In this paper, we will cover these peculiarities and their related issues. The Kazakh Language Fluency Test is a young exam, and during its development we faced many difficulties. One of the goals of the university and the country is to encourage fluency in the Kazakh language for all citizens of the Republic. Nazarbayev University has introduced a Kazakh language program to assist in achieving this goal. This policy is one step in ensuring that NU students have a thorough understanding of the Kazakh language through a fluency test based on the International English Language Testing System (IELTS). The Kazakh Language Fluency Test aims to determine students' knowledge of the Kazakh language. There are three types of students at Nazarbayev University: Kazakh-speaking heritage learners, Russian-speaking students, and English-speaking students. Unfortunately, we have Kazakh students who do not speak Kazakh. All students who finished school with Russian-language instruction are given the Kazakh Language Fluency Test in order to determine their Kazakh level. After the test, students can choose the appropriate Kazakh course: Basic Kazakh, Intermediate Kazakh, or Upper-Intermediate Kazakh. The Kazakh Language Fluency Test consists of four parts: Listening, Reading, Writing, and Speaking. They are taken on the same day in the abovementioned order.
Keywords: Diagnostic language test, Kazakh language, placement test, test result.
1929 Software Maintenance Severity Prediction with Soft Computing Approach
Authors: E. Ardil, Erdem Uçar, Parvinder S. Sandhu
Abstract:
As the majority of faults are found in a few of a system's modules, there is a need to investigate the modules that are affected severely as compared to other modules, and proper maintenance needs to be done on time, especially for critical applications. In this paper, we have explored different predictor models on NASA's public domain defect dataset, coded in the Perl programming language. Different machine learning algorithms belonging to the different learner categories of the WEKA project, including a Mamdani-based fuzzy inference system and a neuro-fuzzy based system, have been evaluated for the modeling of maintenance severity, or the impact of fault severity. The results are recorded in terms of Accuracy, Mean Absolute Error (MAE), and Root Mean Squared Error (RMSE). The results show that the neuro-fuzzy based model provides relatively better prediction accuracy as compared to the other models and, hence, can be used for maintenance severity prediction of the software.
Keywords: Software Metrics, Fuzzy, Neuro-Fuzzy, Software Faults, Accuracy, MAE, RMSE.
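The two error measures reported above are standard; a short reference implementation follows, with placeholder severity labels and predictions standing in for real model outputs.

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean Absolute Error."""
    return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))

def rmse(y_true, y_pred):
    """Root Mean Squared Error."""
    return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

# Placeholder severity labels vs. model outputs (illustrative only).
print(mae([1, 3, 2, 4], [1, 2, 2, 5]), rmse([1, 3, 2, 4], [1, 2, 2, 5]))
```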